Is Linux Enterprise Ready?

Making the move

No doubt this topic has been debated to death; however, as I have a different perspective on this issue, I reckon it's worth writing down.

Over the past few weeks I've been involved with one of our local customers who, after a lot of consideration, has decided to make the jump to Linux. This was no quick decision, mind you, and it was about more than an "I'm tired of paying Microsoft for licenses" thing.

Why the Move?
Linux made its way into the organization when I chose to use it as a desktop system while I was still consulting for the customer as a DBA and J2EE developer. (Yeah, I know, a weird combo, but I've never liked scripting languages that much, so I chose Java/J2EE for my DBA tools.) I got pretty uptight when Eclipse and Windows (the company standard) decided to crash or hang on me every few hours, and I moved back to Linux.

As a DBA, you tend to get involved with all sorts of issues, mainly because with a reasonably large or busy application the database is normally the first thing to take the blame when performance dips. During one of my investigations, I found that the database was having trouble sending data back to the client applications (network waits). The networking guys just laughed at me and said the database shouldn't send so much data back. (!?!)

There were some variables involved: all of the servers (application, Web, database, mail) were hosted at a different site, and all traffic (including Web traffic) was being routed through that offsite location (a local ISP). My theory was that some people were misusing the Web, since my investigation showed that HTTP traffic was extremely high.

This was really the first case in which I could implement Linux with a direct business benefit. After numerous consultations with the client (a Windows-only type), I decided to take an older PC that was sitting around in the storeroom, slap some SCSI drives into it, and install Mandrake Linux on it. My reasoning was that Mandrake is a pretty friendly O/S for a Windows-skilled "LANnie" to pick up. I then installed the Squid proxy and a Web reporting tool (Squint) on what became known as the "proxy server." This let us report, per user, the amount of time spent on Web sites, the amount of data downloaded, site details, and so on. We could pinpoint exactly who was surfing, for how long, and which sites they were viewing. To make it work, we pointed all the client browsers at the proxy and changed the firewall rules so that only HTTP traffic from the proxy server was allowed out.
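
For a sense of what that reporting boils down to, here is a minimal sketch of the same idea in Python: tally traffic per client straight from Squid's access log. This is not the Squint tool itself; the log path is only an example, and the field positions assume Squid's stock "native" log format.

from collections import defaultdict
from urllib.parse import urlsplit

LOG_PATH = "/var/log/squid/access.log"  # example path; adjust for your installation

bytes_per_client = defaultdict(int)                       # client IP -> total bytes
sites_per_client = defaultdict(lambda: defaultdict(int))  # client IP -> site -> bytes

with open(LOG_PATH) as log:
    for line in log:
        fields = line.split()
        if len(fields) < 7:
            continue  # skip blank or malformed lines
        client, size, url = fields[2], fields[4], fields[6]
        site = urlsplit(url).hostname or url  # CONNECT entries log "host:port" rather than a URL
        try:
            size = int(size)
        except ValueError:
            continue
        bytes_per_client[client] += size
        sites_per_client[client][site] += size

for client, total in sorted(bytes_per_client.items(), key=lambda kv: -kv[1]):
    print(f"{client}: {total / 1024 / 1024:.1f} MB")
    for site, size in sorted(sites_per_client[client].items(), key=lambda kv: -kv[1])[:5]:
        print(f"    {site}: {size / 1024:.0f} KB")

Squid also logs the authenticated user name in a later field, so the same approach works per user rather than per client IP once proxy authentication is switched on.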

We gathered statistics for two or three days, and our first report proved that my hunch was correct. A guy in the admin department was using up a lot of our much-needed bandwidth by downloading, well, porn; other people were streaming audio, and still others were downloading MP3s, games, and so on. Now, in South Africa, bandwidth is expensive and slow. We have only one provider of leased and other telco lines (changing in 2006), and 3G isn't what it should be (yet).

We blocked some sites; the client issued some final warnings; and, by the next day, the system was flying again. I started using our "proxy server" for more things, to see how much we could get out of a simple PC (about 128MB of RAM, 40GB of disk space, and a 1GHz Pentium III CPU). We implemented CVS, an open source version control tool. We gave users in the operations department a home directory to back up documents. We set up some print queues.

The CIO was pretty happy with what we managed to squeeze out of the PC. The key thing to realize here is that Linux could significantly benefit the business by doing small things very well, at a low cost. The question was: Could it take over critical operations in the enterprise system?

To me, the best place for Linux today is with the most "invisible" part of the business: the data. A database should do one thing very well: store data and provide easy and efficient access to it. It doesn't need fancy GUIs. It doesn't need wizards, graphs, reporting, and other things associated with client applications. The database is a storage engine (with a few twists). Linux on the desktop hasn't been successful so far for many reasons, which I'll address in my next article. But for database, application, Web, and mail servers? If configured correctly (on any operating system), they tend to run in lights-off mode most of the time, or they should.

One of the issues in the environment was that the Windows servers had to be rebooted pretty regularly, especially the database server. The database engine uses a lot of resources and was pushing the box to the limit. I felt that Linux would make a better database server platform than Windows: you have more flexibility in tuning Linux, and after years of working with both environments I perceive Linux to be the more stable O/S, especially for an RDBMS.

While we were contemplating the shift, the Windows O/S did its best to help us make a decision. One night, I received a call at 3 a.m. from the network admin, who told me they couldn't boot any of their servers. A virus had managed to corrupt the ntoskernel.dll file (or something like that), and the O/S had to be recovered. (At least the backups were complete....) Something went wrong during the recovery, and by the time I arrived on site, I was told that the O/S had to be trashed and we would have to revert to backup. We lost about four days due to wait time for hardware and O/S configuration. After that, the writing was on the wall - we were going Linux, wherever we could. As a matter of fact, we already had two Linux servers in the rack: our integration server and a server responsible for client communications (it generated PDF documents and mailed them out).

Even before this happened, I had presented a broader Linux strategy to the customer. Here it is at a high level:

1.  Move the database servers to Linux.
This is the lowest-risk step because the users aren't affected at all; if anything, they should see more uptime and better scalability. We didn't anticipate much of a performance boost - moving to Linux on the same 32-bit hardware wouldn't change outright performance dramatically - but we were expecting a small improvement.

2.  Move the Web server (IIS) to Apache or Tomcat.
Most Web servers in the world run Apache, and moving to it gets rid of paying licenses for a commodity. It's also worth mentioning that the customer's enterprise application is a J2EE Web app, and we felt the corporate Web site should be standardized on something like JSP, which can be supported by more than one person and run in multiple environments.

3.  Move the application server to Linux.
This should've been easy, but it wasn't. The early application developers had used the PowerBuilder DataWindow in their J2EE app, and we weren't convinced that the move would be seamless, so we left this until last.

4.  Convert all remaining client/server apps to thin client, browser-based apps.
A browser-based app would mean that end users could use any OS and browser they felt comfortable with. It would also put the business in a position to test out desktop Linux, and to do so at its own pace. Why would they want to? The most significant savings in a corporate Linux shift are at the desktop level, for application users. Power users may still want to run Windows, but the person who comes to work in the morning, switches on his PC, and fires up his e-mail client and the application he needs to do his work could be using any operating system - Mac, Linux, Windows, Solaris.

Even better, you probably don't need an "enterprise" version of Linux at the desktop level, meaning that the O/S won't cost you a cent. Now calculate this for an organization with 500 users, and remember to add up Office and any other Windows licenses; a rough back-of-the-envelope sketch follows this list.

5.  Desktop Linux, where it makes sense.
More of the above. There are some good articles on the Web from various authors pointing out that most Windows fans are really Office fans. Microsoft Outlook is the de facto standard in organizations because of its integrated collaboration. However, the largest percentage of employees in a typical organization probably uses about 15% of Office, and it makes sense for these users to try out OpenOffice. The tactic here was to install OpenOffice on Windows, swap the mail client for something like Thunderbird, and do proper UAT to see how that goes.

6.  Mail servers.
Depending on the business and how the organization uses Outlook calendaring (if it uses Outlook at all), this could be an easy or a difficult shift. In this case, about 30% of the users in the organization use Outlook with calendaring, so it's not practical yet. How do we do it? In this case, it doesn't really matter: Windows and Linux can coexist quite easily in the same environment, and I would never advocate a "rip and replace" strategy. The best strategy we can think of for now is to go for a CRM system (the client needs and wants to implement CRM) that integrates collaboration. First choices for now: SugarCRM and possibly Compiere.
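
To put rough numbers to the desktop licensing point in step 4, here is a back-of-the-envelope sketch. The per-seat prices are purely illustrative assumptions, not vendor quotes; substitute whatever your reseller actually charges.

# Back-of-the-envelope desktop licensing arithmetic for a 500-user organization.
# All per-seat prices are assumed, illustrative figures - not vendor quotes.

users = 500
windows_per_seat = 140.0  # assumed desktop Windows license
office_per_seat = 300.0   # assumed Office suite license
extras_per_seat = 60.0    # assumed antivirus, client access licenses, and so on

proprietary = users * (windows_per_seat + office_per_seat + extras_per_seat)
linux = 0.0               # a non-"enterprise" desktop distribution carries no license fee

print(f"Proprietary desktop licenses: {proprietary:,.0f}")
print(f"Linux desktop licenses:       {linux:,.0f}")
print(f"License saving:               {proprietary - linux:,.0f}")

Support, retraining, and migration effort obviously eat into whatever number comes out, but the point is the size of the license line on its own.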

The customer was ready for phases one, two, and three. When we started strategizing the Linux shift, an interesting question came up, and it's one that comes up quite a lot now: While we're doing this move, how about investigating 64-bit architecture? Surely that would also make a massive difference? Our initial tests showed that we would get a 10-15% performance increase on the same hardware, which is fairly insignificant; sooner or later we would run into hardware limitations. The shift would extend the useful life of the current hardware by only about eight months, and that seemed a short-sighted strategy.

The customer asked what was needed for a "significant" performance improvement at the database level, and how we could ensure that the hardware would last us for the next five years. The key thing about a database is that it's only as fast as the number of I/O requests it can process, and disk reads and writes are generally very expensive, slow operations. To offset this, you throw RAM at the problem and increase the database cache so that the engine doesn't have to do as many direct disk reads and writes. There's a lot more to it, but that's the basic rule - assuming, of course, that the database engine has been properly configured to use the machine's resources efficiently and that the queries thrown at the server are optimized.
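
To put a rough number on that rule, here is a small illustration of how the average cost of a read falls as the cache hit ratio rises. The latency figures are assumptions chosen for the sake of the example, not measurements from this environment.

# The "throw RAM at the problem" rule: average read cost versus cache hit ratio.
# DISK_READ_MS and CACHE_READ_MS are assumed round figures, not measurements.

DISK_READ_MS = 8.0    # assumed cost of a random disk read
CACHE_READ_MS = 0.01  # assumed cost of a read served from the database cache in RAM

def avg_read_ms(hit_ratio):
    """Average time per logical read at a given buffer-cache hit ratio."""
    return hit_ratio * CACHE_READ_MS + (1.0 - hit_ratio) * DISK_READ_MS

for hit in (0.80, 0.90, 0.95, 0.99):
    print(f"hit ratio {hit:.0%}: {avg_read_ms(hit):.3f} ms per logical read")

In this toy model, going from an 80% to a 99% hit ratio makes the average logical read more than fifteen times cheaper, which is why extra cache memory usually buys more for an I/O-bound database than a faster CPU does.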

Rudi Leibbrandt works at Sybase South Africa managing the Sybase development toolset.
