
Open Source: Changing the Enterprise Software Supply Chain for Good

OSS Development Model Examined

The open source software development model clearly represents a profound and fundamental change from traditional, proprietary development models.

In the proprietary world, a software company invests massive sums in development, sales, and marketing. It recoups that investment through license fees, which carry strict provisions regarding redistribution of the product (you can't), accessing the source code (are you kidding?), and making changes to the source code (call your lawyer). Open Source Software (OSS) turns each of these three provisions on its head: distribute it, look at it, customize it. In so doing, OSS offers a vastly different value proposition to business users, enabling application and stack customization for companies of every size - a benefit that, in the proprietary software world, is typically available only to large customers with deep pockets - and freeing businesses from having to count license seats to stay in compliance. Beyond these business benefits, the emergence of this new way to develop and use software also portends a vastly different distribution model.

The Way We Were
In the proprietary model, the value chain looks similar to what is shown in Figure 1. The vendor develops and markets a product, which it supplies through distributors (such as Ingram, GE Access, or Tech Data), who in turn supply - and, to varying degrees, educate, finance, and support - the Value Added Resellers (VARs) who ultimately provide the product to the customer. This is the way most technology is sold, with established vendors frequently selling 70% or more of their product through the channel. In this model, software vendors traditionally keep the lion's share of the gross profit on each unit sold; distributors usually take the smallest percentage, which they make up in volume; and the VAR usually walks away with anywhere from 15-30% gross margin.
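To make the margin math concrete, here is a minimal sketch in Python of how a single unit's list price might be split across the traditional channel. The percentages are purely hypothetical, chosen only to be consistent with the ranges described above; actual channel discounts vary widely by vendor and program.

# Hypothetical split of a $100 list-price license across the traditional channel.
# All percentages are illustrative only.
LIST_PRICE = 100.00

vendor_share = 0.65        # vendor keeps the lion's share of gross profit
distributor_margin = 0.07  # distributors take the smallest cut, made up in volume
var_margin = 0.28          # VAR gross margin, within the 15-30% range cited above

assert abs(vendor_share + distributor_margin + var_margin - 1.0) < 1e-9

for player, share in [("Vendor", vendor_share),
                      ("Distributor", distributor_margin),
                      ("VAR", var_margin)]:
    print(f"{player:<12} ${LIST_PRICE * share:6.2f}  ({share:.0%} of list)")

Run as written, the sketch simply prints the dollar amount each player would retain on a $100 sale under those assumed splits.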

This distribution of margins across the supply chain for traditional shrink-wrapped software reflected the risks and costs incurred by the different players. Because proprietary software vendors usually start from scratch, and because the development model they choose prevents them from sharing the work, customers pay a price premium. Take the example of three operating systems: Mac OS X, Windows, and Linux. All three are composed of similar elements, such as a graphical user interface and a kernel. In the Windows model, all of it is proprietary.

Microsoft develops its kernel and graphical user interface entirely on its own, expending a large amount of resources and charging prices that reflect its R&D and marketing costs and the risk associated with winning in the marketplace. This is the quintessential proprietary model.

Second, there is Mac OS X, which Apple develops using the work of FreeBSD as a base operating system, focusing its own effort on the GUI and other accoutrements that make its operating system one of the most useful and well designed.

Finally, there is Linux, where communities develop around software projects and then companies like Novell and Red Hat focus on value-added services like testing, certification, support, and distribution. The R&D risk borne by these companies is less because it is shared by a community of developers and a whole industry of vendors.

Recently, commercial software has served as the anchor product in many VARs' offerings, allowing them to upsell services. So it's understandable why the emergence of OSS might be greeted with skepticism by some in the channel. After all, OSS can be perceived as a threat to their bread-and-butter enterprise software business, which some analysts now believe is in the midst of a trend toward declining prices. First hardware margins evaporated; now proprietary software prices seem to be heading south. What's a VAR to do? Embracing open source may be one good remedy, but take heed - there are key differences that VARs need to be aware of as they enter the OSS world, differences that can help them make more money than before or that, if ignored, can leave money on the table for a competitor to pick up. There are two key differences, it seems to me, between OSS distribution and the existing distribution model: the "one size fits all" issue and the changing role of the tier-two distributors.

One Size Fits All
One of the key benefits of OSS is that users have the option to customize the product to meet their business's exact needs, whether that means integrating the system with other legacy solutions or adding a feature or capability that makes the system more useful.

Proprietary Software Feedback Loop: Effective? Sometimes. Fast? No.
Since proprietary software vendors maintain tight control over their source code, any changes to their products must be carefully and centrally orchestrated by the vendor. In this model, customer feedback on the product flows in the opposite direction from sales: the customer tells their VAR, who is typically the customer's trusted technology adviser, what they think of the product. VARs act as collection points for customer comments, passing them upstream to the vendor, either through their distributor or, frequently, at the vendor's partner event (see Figure 2). This feedback mechanism tends to be slow because it's indirect.

Also, the feedback loop to the vendor is only sometimes effective because vendors are businesses, and, as such, they focus their energy on solving the problems that, in their judgment, will have the broadest appeal to the largest number of users. If the specific thing you want their product to do is not widely shared among other users, or if the amount of money your company spends on the vendor's product does not represent a substantial percentage of their total revenue (that is, if you're not really big), then your comments are not likely to be acted upon quickly. In an interview for this article, SugarCRM cofounder and CEO John Roberts observed that, in the traditional proprietary model, resellers are largely shut out of playing an active role in this feedback process. Sure, they can provide feedback through vehicles such as user groups and partner conferences, but this offers neither the agility nor the flexibility to respond to varied and quickly changing customer requirements. An example of something that a proprietary vendor may be slow to implement, if at all, is translating their product into a "small" language - let's say, Bulgarian. Though potentially very important to the speakers of that language, the expected net new revenue may not justify the vendor's cost of implementing, and then maintaining over time, the product in that language.

Open Source Software Feedback Loop: Effective? Eminently So. Fast? Wicked.
The open source development model fundamentally transforms the relationship between vendor and channel (see Figure 3). Where the proprietary software distribution model is hierarchical and primarily uni-directional, with open source, channels are empowered to not just distribute the product and take their cut, but to extend and customize the product to meet their clients' specific needs. When feedback is received by the open source VAR/developer, they are empowered to act on it immediately and in such a way as to maximize the utility of the product to their customer base. These extensions can then be contributed back into the product through the development community that surrounds it. In this way, when managed properly, open source software products can develop additional functionality that satisfies real customer needs far more rapidly than traditional proprietary software products can.
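Returning to the Bulgarian-translation example, a minimal gettext-style sketch in Python shows how little stands between an open source VAR/developer and that kind of request. Everything here is hypothetical - the text domain, directory layout, and strings are invented for illustration - but the mechanism is the standard one: the VAR ships (and contributes back) a message catalog on their own schedule instead of waiting on a vendor's roadmap.

import gettext

# Hypothetical message catalog layout:
#   locale/bg/LC_MESSAGES/crm_app.mo   <- compiled Bulgarian translations
# A VAR/developer could produce and ship this catalog without touching the
# application's core code, then contribute it back to the project.
translation = gettext.translation(
    domain="crm_app",     # hypothetical application text domain
    localedir="locale",   # hypothetical catalog directory
    languages=["bg"],     # Bulgarian
    fallback=True,        # fall back to the original strings if no catalog exists
)
_ = translation.gettext

print(_("Welcome"))  # prints the Bulgarian string if the catalog provides one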

The Role of Distributors
The distribution model for shrink-wrapped software was largely adopted from the hardware distribution model that emerged in the 1980s. Distributors grew out of the need to supply hardware to market more efficiently. Hardware vendors wanted a better way to forecast demand, and distributors could provide better forecasts because they touched lots of VARs. For their part, VARs didn't want to maintain relationships with multiple hardware vendors. Thus the need emerged for the classic middleman, and the distributor was born.

When hardware prices took a nosedive in the 1990s, many VARs turned to software to pump up the bottom line, and the natural choice was to buy this software from their existing distributor sources. Faced with declining hardware margins, VARs also moved more aggressively into offering total solution integration and set-up and support services, once the exclusive domain of the large vendors' professional services organizations. As VARs shifted their business to include software, distributors fell easily into the new role of supplier of choice for several reasons. First, much enterprise software was sold preloaded on hardware, which required only nominal changes to the overall supply chain flow. Second, leading distributors had achieved excellence in logistics, which translated well to the need to distribute and keep track of multiple versions of multiple software products. Third, since the distribution of proprietary software and hardware must be controlled to prevent gray market activities, distributors play an important role here because they can easily "turn off" a suspect VAR. Finally, VARs frequently had established financing relationships with their distributors, which made buying from them easier still.

The Internet as a distribution mechanism in combination with the open source model empowers vendors and VARs to cut out the middleman, as it were, at least for the distributor's traditional value-add in the shrink-wrapped software supply chain. With OSS, too, the black and gray market problem is dramatically reduced, if not eliminated, by the very nature of the open source license agreement - share it, look at it, add to it.

However, just because open source reduces the importance of the role traditionally played by distributors doesn't mean that there's not a crucially important role that distributors are very well placed to fill in the new open source model. I am referring to the need to provide resellers and, thus, mid-size customers, with OSS stack and mixed OSS/proprietary stack testing, certification, and backing. Who better to provide the critical assurance that this version of this open source application will work with that application server running on that Linux distro with this database? By the way, Mr. Open Source VAR/developer, sign up with me and I'll provide testing and certification for your customized versions/extensions of this application running on that distro with that app server, etc., etc. A Tech Data, Arrow, or Avnet has the labs, the expertise, the name, and the reach to eliminate this key hurdle to wider open source adoption among VARs and mid-market companies. In so doing, they could reassert their relevance in the shifting software marketplace and deliver critically needed assurance to VARs interested in taking on OSS and, by extension, provide this same assurance to the VAR's mid-market customers.
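As a rough illustration of the assurance a distributor could sell, the Python sketch below models that certification matrix as data: a set of stack combinations (application version, app server, Linux distro, database) that have passed the distributor's lab testing. The product names and versions are made up for the example; this is a sketch of the idea, not anyone's actual certification program.

# Illustrative only: a distributor's "certified stack" lookup.
# Each entry is an (application, app server, distro, database) combination
# that has passed lab testing; names and versions are hypothetical.
CERTIFIED_STACKS = {
    ("crm_app 4.2", "tomcat 5.5", "sles 9", "mysql 4.1"),
    ("crm_app 4.2", "jboss 4.0", "rhel 4", "postgresql 8.0"),
    ("wiki_app 1.3", "apache 2.0", "debian 3.1", "mysql 4.1"),
}

def is_certified(app: str, app_server: str, distro: str, database: str) -> bool:
    """Return True if this exact stack combination has been lab-certified."""
    return (app, app_server, distro, database) in CERTIFIED_STACKS

# A VAR checking a proposed customer deployment:
print(is_certified("crm_app 4.2", "jboss 4.0", "rhel 4", "postgresql 8.0"))  # True
print(is_certified("crm_app 4.2", "jboss 4.0", "rhel 4", "mysql 4.1"))       # False

The same lookup could just as easily cover a VAR's customized versions and extensions, which is exactly the service the paragraph above imagines a distributor offering.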

Next Steps
I'd love to see one of the large distributors announce an open source center of excellence, or, better yet, a mixed source center of excellence, perhaps in partnership with established open source players and emerging companies like SpikeSource and Black Duck. VARs and mid-market businesses need help validating best-of-breed mixed stacks, testing them out in different deployment environments, and providing the much-needed support backstop for increasingly varied and complex combinations. Distributors, it seems to me, are uniquely well positioned to offer this critical service. By so doing, they can accelerate the mid-market adoption of open source solutions.

More Stories By Greg Wallace

Greg Wallace is Co-Founder and Chief Marketing Officer of Emu Software, Inc. Greg received his MBA and Masters of International Affairs degrees from Columbia University in New York City. He also spent a year as a Rotary Foundation Scholar at the University of Barcelona, Spain. He can be reached at [email protected]


