Getting Down to Business with Linux

On the Move

With the recent releases of SuSE 9.2 Professional and the Novell Linux Desktop operating systems, the Linux desktop is ready to compete with Microsoft Windows for client-side computing in a business environment. I know this remains a matter of debate, but it is not unreasonable to expect Linux to garner a respectable share of the desktop market if the migration is done correctly, and the cost savings of using Linux on the desktop could easily exceed the savings realized by replacing just the server components of your infrastructure.

Linux desktop adoption will occur in waves. The first wave of opportunity will be in specific-purpose computing: areas where the client primarily runs a single application, such as retail point-of-sale, government, scientific, manufacturing, and medical systems. For the most part, these client machines are not leading-edge Pentium 4-class hardware, and they do not need a fully configured office suite. In these environments the most important requirements are reliability and the ability to run on modestly configured hardware. Linux is perfect for these scenarios.

The next wave of opportunity will be in branch offices and remote facilities where security and reliability are important. The current OpenOffice suite on Linux is quite good and serves most people's needs. I travel a lot, and my primary desktop is SuSE Linux 9.2 Professional with OpenOffice 1.1 and Ximian Evolution for my mail, contacts, and task management. I've found Linux to be a very productive environment in many ways; in particular, its immunity to the myriad Windows worms, viruses, and other security breaches has delivered a bigger productivity gain than I originally expected. I can attest to this gain because I keep working while others around me scramble for the latest virus scanner to fix their Microsoft Windows desktops.

Moving to Linux

Linux is becoming much more attractive on the desktop now that Microsoft's Longhorn has been delayed until late 2006. Most realize that Longhorn will also force a rewrite of existing Windows applications - even those written to the Microsoft .NET standard. The cost of these rewrites, plus the cost of upgrading most hardware to run the new Avalon GUI, are major issues facing IT managers. As a result, organizations are evaluating the move to Linux sooner rather than later as they look to maximize their investments in existing software development, desktop hardware, and infrastructure.

Tips on Getting to Linux Successfully

1.  Identify a specific area of the business that will benefit from migrating to Linux, keeping in mind that Linux brings much higher levels of reliability and security and supports modestly configured hardware. This is where cost savings will be maximized and Linux will provide a substantial return on your investment. Avoid the pitfall of biting off more than you can chew: find a subset of an application that can benefit from being ported to Linux. Don't try to rewrite your entire ERP system; start with just the warehouse management component, for example.

2.  Evaluate all your options before you start coding. Before trying to rewrite the existing application to run natively on Linux, you may want to test the Linux waters and take an interim step by looking at WINE as a viable alternative. WINE is attractive because it lets you execute existing Windows applications on Linux with minimal changes.

This is another heated debate in the Linux world - using WINE (Wine Is Not an Emulator) versus writing a native application for Linux. For those in the Linux world who believe nothing but a native application will do - get over it. The major inhibitor to adoption of Linux on the desktop is the lack of business applications; WINE helps accelerate the migration of existing business applications to Linux and, in the end, that's what is important.

By using WINE, you can avoid having to port the entire application to run natively on Linux - which is a monumental task. While most Linux distributions ship with the current stable build of WINE, let's not forget CodeWeavers' commercial CrossOver WINE technology. I've found Jeremy White at CodeWeavers to be very accommodating in addressing specific areas where WINE was lacking, and installing CrossOver can be silent and painless for the end user. Obviously, using WINE is only a stopgap measure that lets you test the Linux waters without significant cost (a minimal launcher sketch appears after this list). Ultimately, you want a true cross-platform application.

3.  Avoid writing an application that will run on only one operating system. Evaluate the existing application and its future in your organization. You do not want to write the application in a language that locks you into one platform, but in one that will compile and run on either a Microsoft Windows or a Linux platform. This improves your return on investment because you have a single source code line that can be compiled for either platform; the application is easily maintained, and simultaneous updates for both platforms are easier to manage. Whenever possible, choose a high-level language that offers a level of abstraction to relieve your team from trapping every possible message or managing every pixel on the screen. A high-level language lets you focus on the business case - not the nits of the operating system and its myriad function calls. A high-level language that compiles to either Windows or Linux takes care of the grunt work and improves your productivity (see the cross-platform sketch after this list).

4.  Choose a language that fully supports robust object-oriented coding practices. As you write the application, create the business processes in classes isolated from the user interface. This provides maximum flexibility in the future, since some parts of the application may be suited to a rich-client model while others are better suited to a browser-based model. By encapsulating the business logic in functional classes you have the flexibility to do either, or both, in the same application (a skeletal example follows this list).

5.  Accelerate your development by choosing an integrated development environment that provides not just a colorful text editor but a robust debugger, intelligent prompting for defined objects and functions, a report builder to create business reports quickly, and source code management functionality that lets the team check code out, check it in, and perform difference and merge operations.

6.  To ensure success, make sure the Linux version of the application is familiar to the end user. The application should behave just like the Windows version; this is important for end users to buy in to the new Linux application. Application familiarity maintains end-user productivity because people won't have to learn a new application. Remember, in most cases end users do not care what operating system they are using, but they do care if the application behaves differently from their existing application - especially if it's more complex or requires additional keystrokes.

7.  Watch out for products with dual licenses. Read the license agreement of every development tool, report writer, debugging aid, or database, and make sure you understand its requirements. Some products don't require a license fee if your own product is open source, but others don't make this distinction; some require that you buy a license for commercial use, while others do not. This is a very important point to investigate before choosing a development tool, programming aid, or database.

8.  Make sure the products you choose provide some type of technical support. Choose products that offer installation support and an active newsgroup where you can get your questions answered. If available, it is also smart to buy a support contract for the first year so your project does not get sidetracked on a technical issue that eats up valuable time and effort. Tech support contracts pay for themselves with just a couple of calls.

9.  Once your first project is up and running, monitor it and measure your return on investment. I'm positive you will find it substantial, and it will pave the way for future projects because management will clearly see the advantages.

10.  Finally, remember that not all applications are candidates for Linux migration, so make sure you involve the end users in your investigation and analyze the application thoroughly before embarking on a conversion.
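
Tip 2, in practice: as a rough sketch of the stopgap approach, the short Python script below hands an existing Windows executable to WINE from a desktop launcher, so the end user never has to know the application isn't running natively. The application path and WINE prefix shown here are hypothetical placeholders, not part of any particular product; the exact setup will depend on your WINE or CrossOver installation.

    # Minimal launcher sketch: run an existing Windows application under WINE.
    # The prefix and .exe path below are hypothetical placeholders.
    import os
    import subprocess

    def run_under_wine(exe_path, *args):
        env = os.environ.copy()
        # Keep this application's registry and C: drive in its own prefix
        # so it cannot disturb other WINE-hosted applications.
        env["WINEPREFIX"] = os.path.expanduser("~/.wine-warehouse")
        return subprocess.call(["wine", exe_path, *args], env=env)

    if __name__ == "__main__":
        run_under_wine(os.path.expanduser(
            "~/.wine-warehouse/drive_c/Program Files/Acme/warehouse.exe"))

Wrapping the call this way also gives you one obvious place to log failures while you evaluate whether the application behaves well enough under WINE to postpone a native port.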
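
Tip 3, illustrated: the fragment below (Python, with invented file and directory names) keeps a single source line for both platforms by letting the runtime resolve user directories and path separators instead of branching on Windows-versus-Linux details. It is a sketch of the principle of coding to an abstraction, not a recommendation of any specific tool.

    # One source tree for both platforms: let the runtime hide OS differences.
    # File and directory names here are illustrative only.
    from pathlib import Path

    def report_directory():
        # Path.home() resolves to C:\Users\<name> on Windows and to
        # /home/<name> on Linux, so this code never mentions either convention.
        target = Path.home() / "warehouse-reports"
        target.mkdir(exist_ok=True)
        return target

    def save_report(name, text):
        out = report_directory() / (name + ".txt")
        out.write_text(text, encoding="utf-8")
        return out

    if __name__ == "__main__":
        print(save_report("daily-pick-list", "42 orders staged"))

The same file runs unchanged on either desktop, which is exactly the single-source maintenance story the tip describes.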
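
Tip 4, illustrated: the skeleton below (Python, with invented class and field names) keeps a business rule in a plain class with no user-interface imports, so the same object can sit behind a rich-client form today and a browser front end tomorrow. The discount rule is made up purely for illustration.

    # Business logic isolated from any user interface; names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class OrderLine:
        sku: str
        quantity: int
        unit_price: float

    class PickListCalculator:
        """Pure business rules: no widgets, no HTML, no printing."""

        def __init__(self, discount_threshold=100.0, discount_rate=0.05):
            self.discount_threshold = discount_threshold
            self.discount_rate = discount_rate

        def total(self, lines):
            gross = sum(l.quantity * l.unit_price for l in lines)
            if gross >= self.discount_threshold:
                gross *= 1 - self.discount_rate
            return round(gross, 2)

    # Either front end calls the same class:
    #   rich client:   total_label.set_text(str(calc.total(lines)))
    #   browser page:  return "<p>Total: %s</p>" % calc.total(lines)

Because the calculator knows nothing about how its result is displayed, moving from a desktop form to a browser-based model later means rewriting only the thin presentation layer.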

In closing, remember to take a small application, or a subset of a larger application, as your first project. Add 30% to your estimate to account for the learning curve and, most of all, keep track of the challenges you encounter and how you solved them so your next project can benefit from the experience.

More Stories By Charles W. Stevenson

Charles W. Stevenson, PhD, is CTO of GUPTA Technologies, LLC.
