Changes in Computing

Milestones in Linux desktop adoption

In my editorial in last month's LinuxWorld Magazine, I defined inflection points (with the help of Intel's Andy Grove) as those things that change our behavior with regard to our IT strategy. I was speaking of software and hardware upgrades and other realities of running a business that make us consider other options before investing in our IT infrastructure. Typically, this means purchasing newer versions of the same solutions we've been using.

I believe many of these inflection points are on the horizon, but the question is, how far off is that horizon? For example, every year since 2000 I've heard some industry pundit, analyst, or Linux evangelist declare it "The Year of the Linux Desktop," and to the disappointment of many, that has not yet happened. While the adoption of Linux as a desktop platform is growing, there is no mass migration to Linux for the productivity desktop. That's not to say that there aren't a substantial number of Linux desktop users. To cite Google Zeitgeist for May 2004 (www.google.com/press/zeitgeist.html), 1% of the operating systems visiting Google were running Linux. In contrast, 91% of the operating systems accessing Google during that month were running a version of Microsoft Windows. That's quite a gap, but when you consider the speed at which Linux has garnered that 1%, you have to wonder how long it will be before it overtakes the Mac, at 3%. After all, the Mac has been around since 1984, and according to this Google reporting, it still controls only a small part of the market (see Figure 1).

For Linux to acquire greater market share, something has to make Linux more compelling as a desktop OS. That's why I have chosen to predict, or at least identify, trends that may move us closer to widespread Linux desktop use, or at least signal Linux as an even better opportunity than it is today. Early Linux desktop adopters may find that adopting a configuration slightly different from the way they use PCs today would serve them well in either a Linux or a Windows world.

Personal Data Storage in the Network

Because the migration of operating systems often includes the migration of data, it may be time to rethink where we store the data. If you are going to migrate to Linux and have to move your data anyway, maybe it's time to consider storing that data in the network. That way, whenever you change PCs or need to be mobile, your data is accessible.

Today, we mostly store our personal data on PCs, which may or may not be backed up to the network. Over the long term, you may not store data locally at all, which is a tough pill for many people to swallow; the idea of your data sitting somewhere at the end of a wire over which you have no control is pretty daunting. After all, your data may contain not only financial records but also personal correspondence, pictures from a family vacation, and other precious items that, if lost, would not only cause you great inconvenience but could expose you to other dangers, such as identity theft. However, there are services out there that I would never have considered 10 years ago that make more sense today.

One of the more sensational services is Google's Gmail (http://gmail.google.com) e-mail service, which allows you to store up to 1GB of e-mail on Google's servers in exchange for allowing them to advertise to you based on the content of your e-mail. This raises many privacy concerns. I don't believe it is Google's intent to look into your private life and supply that information to the world, but whenever a mechanism like that exists, so does the opportunity for the system to be compromised or abused. I have been playing with my Gmail account for about a week now and find it to be very useful; I would also consider paying for such a service if it included a service-level guarantee and amenities like spam filtering. Interestingly, Google seems to be banking on the advertising revenues generated by this service to subsidize the costs, if not turn it into a profit center. More to the point, leaving your e-mail and other data in someone else's hands is analogous to leaving your money in the bank: you need a certain level of trust in the validity and security of the system before you'll leave your money with a stranger, and the same trust must exist before you'll do the same with your data. Once your data can be accessed over a network, the stage is set for it to be accessed by application service providers (ASPs), which is when your daily computing environment could really change, whether it's on Windows or Linux.

Greater Availability of Robust Applications

There are a great number of applications available for Linux, and there's an adequate software package for about 80% of all productivity computing tasks. For widespread Linux adoption, the remaining 20% of the functionality needs to be addressed. In the long term, that need will be addressed by one or both of the following methods; in the short term, bridging strategies will be necessary, as discussed later in this article.

Applications Become Web Services

Far-reaching application availability may not come in the same package we are familiar with today. In a recent article alluded to in Red Hat's Under the Brim newsletter, Red Hat chief Matthew Szulik talked about how his vision was that, in five years, your calendar, e-mail, and word processor will become Web services and the infrastructure for those Web services will be provided by open source technologies. While I whole-heartedly agree that his vision is correct in the very long term, I suspect that day is farther out than five years. Perhaps one of the early indicators that Web services have a future is our growing dependency on the Internet for content like news, driving directions, and banking. The evolution of content provided through this medium could be productivity applications, but for that to happen, we need to have pervasive Internet access. Inklings of this are starting to emerge with the availability of data services on our mobile phones and wireless hot spots at coffee houses around the globe, but omnipresent access to rich content through wireless networks is still not a reality in much of the world. My point is this: some of your critical applications for Linux desktop adoption may exist today but they exist in the form of Web services. For example, TurboTax (www.turbotax.com) offers a Web version of their software so that you can be platform independent in using their services. Perhaps some of your other key applications will make their way to the Web as well, making your productivity desktop less platform dependent. With that dependency solved, Linux becomes a more attractive choice, especially when you look at the administration and software licensing advantages.

Major ISVs Developing Native Linux Applications

The Web services vision for Linux applications is fairly futuristic, and we've spent the past 20 years becoming endeared to our PCs, so I believe there will always be a need for applications running locally on your PC. Trusting applications and data to someone else requires a huge leap of faith. That's why people are (understandably) hesitant about adopting Linux for use in personal finance and accounting. As one accounting software manufacturer recently told me, once someone adopts an accounting package, they rarely ever change it, because moving puts a lot at risk and brings the inconvenience of downtime and data migration. However, these same "bean counters" are also well aware of the costs of their IT infrastructure, and given an opportunity for considerably better Total Cost of Ownership (TCO) of that infrastructure, they are apt to evaluate applications that run natively on Linux.

Widespread enterprise implementations of desktop Linux will most likely have to pass through the CFO's office, and many CFOs will want to know if their applications run on the same platform as everyone else's. That's why financial applications for Linux are going to be a consideration for desktop adoption even if for more political than practical reasons. This may be a seemingly insignificant part of the Linux desktop equation, but sometimes Linux adoption will be driven not only by facts but also by perception. It's not uncommon for me to hear Linux diehards comment that they normally use Linux but they still have one Windows PC with Quicken and/or TurboTax to supplement their Linux desktop. That's because there is no Linux application that instills the widespread confidence to handle their fiscal information. But financial packages are just the tip of the iceberg; when a wide variety of applications that are easily installed and supported become available to solve that last 20% of user needs, then we will really begin to see increased desktop adoption.

Expertise and Vendor-Neutral Management Tools

Once Linux is installed on a PC, performance is usually excellent and in my experience downtime is rare. Very rare, that is, until a change occurs; that change could be an update to the OS kernel or the addition of new hardware. Increases in productivity due to less downtime can be quickly eaten up by administration overhead.

It's not adequate just to have tools and solutions that can get the job done; it's more important for those tools to be leveraged by existing personnel who can quickly gain the expertise to administer Linux systems. The learning curve needs to be minimized wherever possible. Intuitive, easy-to-learn tools are an essential part of successful migration strategies.

An emerging Linux software company, Open Country (www.opencountry.com), spoke to me about their vision for managing Linux deployments. Since Linux is an open system, they want to provide the tools to manage it across the network. Open Country's COO, Laurent Gharda, says, "Providing an easy-to-use system management solution is of great importance to deploy and manage Linux for those companies that have little Linux expertise or expertise in another operating system." Not only is it important for such a tool to exist, but for it to be vendor neutral. Mike Grove, Open Country's CEO, adds, "We believe being vendor neutral is important so that customers aren't locked into one distribution over another." They believe their OC-Manager management console will allow administrators to overcome the two most underserved problems in Linux system administration: management complexity and software unpredictability. Whether they succeed remains to be seen, but as more vendors concentrate on the usability aspects of using and administering the Linux desktop, Linux adoption will pick up.

A company thinking along the same lines (though not in terms of vendor neutrality) is Xandros (www.xandros.com), makers of the popular Xandros Business Desktop. They have announced the development of the Xandros Desktop Management Server, which is intended to allow for deployment and management of Xandros desktops. While their solution is aimed at Xandros desktops, it's another example of mass deployment tools that will be developed to try to aid adoption in the enterprise. All of these early efforts are worth watching, as they could be potential aids to your successful migration to a Linux desktop platform.

Breaking Vendor Dependencies

It's time for you to not only re-evaluate operating systems, but also to evaluate new technologies from a platform-neutral perspective. I don't expect anyone to toss their existing infrastructure in favor of a brand new OS; costs of migration right off the bat could be more substantial than using an incumbent solution. However, if you are evaluating a new technology, it's wise to consider whether it will lock you into a dependency on the operating system or browser for Web applications. Even if you don't adopt Linux today, you should consider how purchasing something today will affect your choices tomorrow. Any time you have options to make multiple choices on a solution, you can negotiate prices and make sure that the services and supporting features are best suited for your needs. There is nothing more frustrating than buyer's remorse; you may want to reflect on these points when you begin to evaluate a Linux migration.

In the Meantime: Staging a Linux Migration

There are a few things you can do to prepare for Linux as a desktop platform.

Build Internal Expertise

While you can't spend too much time investigating every new technology that comes on the scene, building expertise within your organization in preparation for serious Linux evaluations is a good investment. It's a good investment for two reasons: you may realize the cost savings Linux can provide you, or it may save you from migrating to Linux in a way that costs you more than it saves. Additionally, this will help you understand what would make good Linux pilots and will help you make informed decisions when it's time to come to a verdict on Linux as a solution. Plus, Linux server expertise, which is a very good investment today, should translate to the desktop when you make that commitment tomorrow.

Develop a Bridging Strategy

Identify solutions that help bridge the gap between ideal Linux solutions and what's workable today. These bridges would include any of the following; a minimal inventory sketch follows the list:
  • Cross-platform applications: Areas for evaluation would include office suites and browsers that can be implemented on existing systems before a large-scale OS migration, e.g., the Mozilla Web browsers (www.mozilla.org) and Sun's StarOffice (www.sun.com/staroffice).
  • Emulators and virtual machines: Recognize those absolutely essential applications you use today in your Windows environment and look at the potential of migrating them intact to Linux. There are a variety of ways to do this: the Windows application executes as intended, but inside a virtual machine running on Linux or against an emulated Windows API. Solutions to consider here include VMware (www.vmware.com), Win4Lin (www.win4lin.com), and Wine (www.winehq.com).
  • Terminal servers: Another approach to filling the gap between widespread application availability and the current situation is to host critical applications on a terminal server and redisplay them to Linux desktops. Applications can even run on Microsoft Windows servers via Windows Terminal Server and be redisplayed to Linux desktops through solutions like Citrix (www.citrix.com), Tarantella (www.tarantella.com), and GraphOn (www.graphon.com). However, this is probably the least attractive way to migrate because it requires a mixed operating system footprint.
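
To make the bridging exercise concrete, here is a minimal Python sketch of the kind of inventory it produces. The application names and bridge choices below are purely hypothetical placeholders, not recommendations; the script is just a checklist in code form.

  from dataclasses import dataclass

  # The three bridge types discussed above.
  BRIDGES = ("cross-platform", "emulator/vm", "terminal server")

  @dataclass
  class BridgedApp:
      name: str    # application as it is used on Windows today
      bridge: str  # one of BRIDGES
      target: str  # what will actually run on (or serve) the Linux desktop

  # Hypothetical inventory; substitute your own essential applications.
  inventory = [
      BridgedApp("Web browser", "cross-platform", "Mozilla"),
      BridgedApp("Office suite", "cross-platform", "StarOffice"),
      BridgedApp("Personal finance package", "emulator/vm", "Win4Lin or Wine"),
      BridgedApp("In-house Windows-only app", "terminal server", "Citrix or Tarantella session"),
  ]

  for app in inventory:
      assert app.bridge in BRIDGES, f"unknown bridge for {app.name}"
      print(f"{app.name:28} -> {app.bridge:16} ({app.target})")

Even a list this small forces the key question for each application: does a cross-platform equivalent already exist, can the Windows binary run intact under a virtual machine or Wine, or must it stay on a Windows server and be redisplayed to the Linux desktop?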

Evaluate Your Existing Infrastructure

Take inventory of what is going on in your enterprise today. Develop metrics that give you a snapshot of how events like PC downtime affect your bottom line; a rough example of such a calculation is sketched after the list below.
  • Look at actual costs of IT systems: Cost involves a lot more than software licensing and hardware. There are also costs associated with viruses, spam, and the downtime and administrative overhead needed to deal with them. In particular, evaluate the potential for open source solutions to reduce these costs, and look hard at any current solutions that are especially prone to viruses or that require costly closed source software to maintain.
  • Determine usable hardware life: The city of Munich, Germany, has received a lot of media attention since announcing their intention to move to Linux as a desktop platform. One of the deciding factors for this was the costs associated with upgrading their hardware to support newer versions of the Windows operating system. Their problems aren't unique, and you may find that users' computing needs aren't changing as fast as hardware is being improved or software versions are coming out. Decide if upgrading to the latest and greatest is going to offer more value than older hardware and a Linux operating system.
  • Determine user needs: I have this dilemma every year when new cars come out. I often think how nice it would be to have the latest model equipped with the latest and greatest gadgets. However, when I evaluate my needs versus my wants, I find that very seldom can I find justification for such a purchase. IT equipment is very much the same; even though faster processors, newer operating systems, and newer applications have been released, the needs of the end user are probably very similar to what they were when the older equipment was first purchased. If end-of-support life is a factor in your decision, make sure that you don't overbuy during your technology refresh.
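
As a rough illustration of the downtime metric mentioned at the top of this list, here is a back-of-the-envelope Python sketch; every figure in it is a made-up placeholder meant to be replaced with numbers from your own environment.

  # Yearly cost of desktop downtime, using hypothetical figures.
  desktops = 200                  # PCs in the organization (hypothetical)
  downtime_hours_per_pc = 6.0     # average downtime per PC per year (hypothetical)
  user_hourly_cost = 45.00        # loaded cost of an idle user, per hour (hypothetical)
  incidents_per_pc = 3            # downtime incidents per PC per year (hypothetical)
  admin_hours_per_incident = 1.5  # administrator time per incident (hypothetical)
  admin_hourly_rate = 60.00       # cost of administrator time, per hour (hypothetical)

  lost_productivity = desktops * downtime_hours_per_pc * user_hourly_cost
  admin_overhead = desktops * incidents_per_pc * admin_hours_per_incident * admin_hourly_rate

  print(f"Lost user productivity per year:  ${lost_productivity:,.2f}")
  print(f"Administrative overhead per year: ${admin_overhead:,.2f}")
  print(f"Total downtime cost per year:     ${lost_productivity + admin_overhead:,.2f}")

Running the same calculation before and after a Linux pilot is one simple way to see whether the reduced downtime and administrative overhead actually show up on your bottom line.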

Conclusion

I would like to say with 100% confidence that anyone reading this article could make the move to Linux on the desktop today. However, that wouldn't be true; there are many factors limiting the adoption of Linux, but those factors are becoming fewer as the days go by. My recommendation is that, if you are interested in Linux in the enterprise but have objections that can't be overcome today, you watch as more milestones are reached. Also, talk to vendors you may already trust who have a Linux strategy today (Novell, IBM, and Sun come to mind). Then choose the one whose products are most complementary to your objectives and start to formulate a bridging strategy to successfully migrate from one platform to another.

I advise spending this much time evaluating one technology largely because open source initiatives are probably the most "disruptive technologies" around, with the greatest potential to succeed at all levels of computing. In the coming years, Linux advances on both the server and the desktop could help you control one of your costliest expenditures and give you more freedom in your technology environment. With increased productivity and real competition in the desktop computing arena, there can be nothing but good things to come.

Tomorrow's Linux desktop may focus less on the client and more on the network. The ecosystem shown in Figure 1 would be conducive to Linux as an end-user platform.

-SIDEBAR-

Linux Migration Success Stories

Both LilleCorp and Addison UK are having tremendous success promoting Linux on the desktop; both are doing so by following an easier-to-manage thin-client strategy.

LilleCorp
Jordan Rosen, CEO of LilleCorp (www.lillecorp.com), a systems integrator in Albany, New York, has identified the healthcare industry as a prime candidate for Linux solutions. LilleCorp has shown that thin-client Linux solutions work: Capital Cardiology Associates, a medical provider, runs on Linux IT infrastructure and is seeing many benefits. The solution is a thin-client computing architecture powered by Linux that minimizes costs by reducing the need for IT services and, more importantly, the productivity lost to computer downtime. The network-based approach lets computing environments be portable while maintaining a single, tightly controlled point of deployment. That high level of control allows them to preserve the integrity of the operating system, minimizing downtime caused by user error. Additionally, they benefit from the fact that far fewer viruses target Linux than Windows.

SchoolLinux
SchoolLINUX.com (www.schoollinux.com) is a division of Addison UK, a developer of Linux solutions for the education and small business markets. They focus on creating easy-to-use servers to provide back-office infrastructure, and desktops that stand up to the heavy multi-user traffic typical of a school's shared workstation environment. Their approach is to provide easy-to-use ICT technologies to schools in the UK that are both cost-effective and useful. Their Linux-powered solutions are well suited to schools that were previously dependent on Windows alone.

More Stories By Mark R. Hinkle

Mark Hinkle is the Senior Director, Open Source Solutions at Citrix. He is also a long-time open source expert and advocate. He is a co-founder of both the Open Source Management Consortium and the Desktop Linux Consortium. He has served as Editor-in-Chief for both LinuxWorld Magazine and Enterprise Open Source Magazine. Hinkle is also the author of the book "Windows to Linux Business Desktop Migration" (Thomson, 2006). His blog on open source, technology, and new media can be found at http://www.socializedsoftware.com.
