Strengthening Open Source's Weakest Link: Software Testing

Broader, more collaborative open testing can yield more meaningful results for business and developers alike

Murugan Pal, founder and CTO of SpikeSource, writes: Pop quiz: If one open source user tests 30 percent of an application, and another tests 20 percent, how much of the application has been tested?

The answer is probably closer to 30 percent than 50 percent, since both users probably focused on common functions like start-up, shutdown, and data access. The problem is amplified when the application is built for n-tier deployment on a service-oriented architecture.
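The arithmetic behind that answer is just set union: if each user's coverage is modeled as the set of functions their tests exercise, combined coverage is the size of the union, not the sum. A minimal sketch (the function names and counts are invented for illustration):

```python
# Model each user's test run as the set of functions it exercises.
# Function names and the total count are hypothetical.
user_a = {"startup", "shutdown", "data_access", "config_load", "logging", "auth"}
user_b = {"startup", "shutdown", "data_access", "session_mgmt"}

all_functions = 20  # total functions in the hypothetical application

coverage_a = len(user_a) / all_functions          # 30%
coverage_b = len(user_b) / all_functions          # 20%
combined = len(user_a | user_b) / all_functions   # union, not sum

print(f"A alone: {coverage_a:.0%}, B alone: {coverage_b:.0%}, together: {combined:.0%}")
# → A alone: 30%, B alone: 20%, together: 35%
```

Because both users hammered the same common paths, the second tester adds only one new function, and 30% + 20% of effort yields 35% of coverage.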

The cornerstone of open source software is the free availability of its source code, which lets developers and users around the world contribute to and improve it. The software naturally grows stronger as it accumulates improvements and sheds imperfections; its quality rises with every additional user and review.

But the model breaks down when it comes to making sure the software actually works in real-world deployment scenarios. The power of participation has been confined almost entirely to the development phase of the software life cycle. Testing remains open source's weakest link, because no single tester can reproduce every intended usage.

While code repositories and other shared resources help developers revise and build upon the efforts of their peers, the testing of that software has remained an uncoordinated, isolated affair. Instead of learning from and enhancing each other's tests, users and developers duplicate tests of the same functions and routines, with no easy way to share their results.

Because most testers are testing only on the platform they happen to be using, most test results aren't widely applicable. Results that show how well a piece of software works on a particular platform might not say much about how well it works with a different operating system, or how well it interacts with other software components. That's a major shortcoming, since open source software components are mostly used as part of a stack with other components.

A Moving Target
The constantly changing nature of many open source programs makes meaningful results even harder to come by. To get results that are accurate and meaningful to a broad section of the open source community, a user would have to retest the software continually on an ever-growing number of platforms, each of which is itself changing.

As a result, some of the biggest challenges of open source software have remained intact. "Dependency hell" (known in the Java and Windows worlds as "JAR wars" and "DLL hell"), in which each piece of software relies on a specific version of another piece of software, continues to be a constant time drain for many IT departments. Not only are the dependencies difficult to resolve; sometimes you end up with redundant copies of the same libraries embedded in the integrated runtime (e.g., Log4J bundled by both Tomcat and Struts).
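The redundant-copies symptom, at least, is easy to detect mechanically: scan a deployment tree for the same library shipped more than once, possibly at different versions. A rough sketch, assuming jars follow the common name-version.jar convention (the directory layout in the usage note is invented):

```python
import os
import re
from collections import defaultdict

def find_duplicate_jars(root):
    """Group jar files under `root` by base library name and return
    only the names that appear more than once (duplicate copies,
    possibly at conflicting versions)."""
    libs = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".jar"):
                continue
            # Strip a trailing version suffix, e.g. log4j-1.2.8.jar -> log4j
            base = re.sub(r"-[\d.]+\.jar$", "", name)
            if base.endswith(".jar"):  # jar had no version suffix
                base = base[:-4]
            libs[base].append(os.path.join(dirpath, name))
    return {lib: paths for lib, paths in libs.items() if len(paths) > 1}
```

Run against a tree containing, say, tomcat/lib/log4j-1.2.8.jar and struts/lib/log4j-1.2.6.jar, it would report log4j twice; resolving which copy wins at runtime is the hard part the scan cannot do.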

What if the open source development model - the "architecture of participation," to use Tim O'Reilly's phrase - could be extended to software testing? If users could easily access, build upon, and contribute to a growing body of open source tests, testing could become an extension of the participatory development process. Fixes could be validated faster, and functionality and backward compatibility across different versions of integrated software components would be easier to verify. Even enterprise customers could participate in this model by validating their own tests against integrated hardware and software runtime environments.

Participatory testing would also help certify the interoperability of the exponentially increasing combinations of component choices. Most businesses use open source software not in isolation, but in stacks of interoperating components. Tests should be able to tell you exactly how well those components work together.
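The combinatorics here are easy to underestimate: each additional component choice multiplies the number of distinct stacks needing certification. A toy calculation (the counts are illustrative, not a survey of actual component choices):

```python
# Number of alternatives per layer of a hypothetical stack.
choices = {"os": 4, "database": 3, "app_server": 3, "web_server": 2}

# Each layer multiplies the total: 4 * 3 * 3 * 2 distinct stacks.
stacks = 1
for n in choices.values():
    stacks *= n

print(stacks)  # → 72
```

No single tester certifies 72 stacks; a community sharing results can cover the matrix a cell at a time.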

Just as the participatory development community relies on open resources and information repositories for source code, the participatory testing community should have open resources for testing, including open tests, test manifests, test results, and interfaces and protocols for sharing tests and results.
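To make "test manifests" and shared results concrete, here is one possible shape for such records and for merging results contributed from different sites. The field names, platforms, and merge policy are all hypothetical; the article does not prescribe a format:

```python
# A hypothetical shareable test manifest: which component and version
# it targets, and which named tests it contains.
manifest = {
    "component": "mysql",
    "version": "4.1.12",
    "tests": ["startup", "shutdown", "utf8_roundtrip"],
}

def merge_results(*result_sets):
    """Combine per-platform results from many contributors into one view.
    Keys are (test, platform) pairs; the first recorded outcome wins
    if two contributors report the same cell."""
    merged = {}
    for results in result_sets:
        for (test, platform), outcome in results.items():
            merged.setdefault((test, platform), outcome)
    return merged

# Two contributors, each testing on the platform they happen to run.
site_a = {("startup", "linux-x86"): "pass", ("shutdown", "linux-x86"): "pass"}
site_b = {("startup", "solaris-sparc"): "pass",
          ("utf8_roundtrip", "linux-x86"): "fail"}
shared = merge_results(site_a, site_b)
```

The point of the sketch is the key structure: indexing results by (test, platform) is what lets isolated testers' contributions compose into coverage neither had alone.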

That's where SpikeSource wants to be a catalyst in promoting participatory testing. We provide all of those resources, along with an environment where open source software users and developers can share, obtain, and exchange federated information about open source testing.

Businesses can upload an application to SpikeSource's open testing tool, which continuously pulls code from open source repositories and builds its own repository of different versions of different components and operating systems. SpikeSource automatically constructs systems out of those repositories, provisions them on virtualized runtime environments, tests them, and records the results.

Interoperability Is Key
For most businesses, how well a component works with others is just as important as its independent functionality. For example, if there's a change to Tomcat, the Apache servlet engine, a test should show how that change affects connected components like JK2, MySQL Connector/J, or MySQL itself.

The ultimate goal is to make open source software more scalable and predictable, so that businesses can use it in conjunction with, or as an alternative to, proprietary software. Our goal at SpikeSource is to further enterprise adoption of open source software: through testing, it becomes more reliable and safer for enterprises to deploy.

More Stories By Murugan Pal

Murugan Pal is founder and CTO of SpikeSource.



Most Recent Comments

David Tomlinson 09/12/05 06:38:50 PM EDT

First, the writer tells us that the biggest problem is testing. Then that the problem with testing is that everyone tests the same thing. Then he tells us that everybody tests on their own platform, so my test results will mean nothing to you. Then he talks about jar wars and dependency hell. Well, which is it? Is the biggest problem with open source that the writer's company doesn't have enough of our business? Have I ever seen a more obvious commercial for a vendor? Of course. Do I want to see more? Hell, no!

The only reason for writing this article is to get his company's name out there and get a few hits on their website. I gotta admit they got a hit from me, but it's the last one.

Pointless article.


@ThingsExpo Stories
Connected devices and the industrial internet are growing exponentially every year with Cisco expecting 50 billion devices to be in operation by 2020. In this period of growth, location-based insights are becoming invaluable to many businesses as they adopt new connected technologies. Knowing when and where these devices connect from is critical for a number of scenarios in supply chain management, disaster management, emergency response, M2M, location marketing and more. In his session at @Th...
"Dice has been around for the last 20 years. We have been helping tech professionals find new jobs and career opportunities," explained Manish Dixit, VP of Product and Engineering at Dice, in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
What happens when the different parts of a vehicle become smarter than the vehicle itself? As we move toward the era of smart everything, hundreds of entities in a vehicle that communicate with each other, the vehicle and external systems create a need for identity orchestration so that all entities work as a conglomerate. Much like an orchestra without a conductor, without the ability to secure, control, and connect the link between a vehicle’s head unit, devices, and systems and to manage the ...
"We're a cybersecurity firm that specializes in engineering security solutions both at the software and hardware level. Security cannot be an after-the-fact afterthought, which is what it's become," stated Richard Blech, Chief Executive Officer at Secure Channels, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
In addition to all the benefits, IoT is also bringing new kind of customer experience challenges - cars that unlock themselves, thermostats turning houses into saunas and baby video monitors broadcasting over the internet. This list can only increase because while IoT services should be intuitive and simple to use, the delivery ecosystem is a myriad of potential problems as IoT explodes complexity. So finding a performance issue is like finding the proverbial needle in the haystack.
In his keynote at 18th Cloud Expo, Andrew Keys, Co-Founder of ConsenSys Enterprise, provided an overview of the evolution of the Internet and the Database and the future of their combination – the Blockchain. Andrew Keys is Co-Founder of ConsenSys Enterprise. He comes to ConsenSys Enterprise with capital markets, technology and entrepreneurial experience. Previously, he worked for UBS investment bank in equities analysis. Later, he was responsible for the creation and distribution of life sett...
The WebRTC Summit New York, to be held June 6-8, 2017, at the Javits Center in New York City, NY, announces that its Call for Papers is now open. Topics include all aspects of improving IT delivery by eliminating waste through automated business models leveraging cloud technologies. WebRTC Summit is co-located with 20th International Cloud Expo and @ThingsExpo. WebRTC is the future of browser-to-browser communications, and continues to make inroads into the traditional, difficult, plug-in web ...
20th Cloud Expo, taking place June 6-8, 2017, at the Javits Center in New York City, NY, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy.
Internet-of-Things discussions can end up either going down the consumer gadget rabbit hole or focused on the sort of data logging that industrial manufacturers have been doing forever. However, in fact, companies today are already using IoT data both to optimize their operational technology and to improve the experience of customer interactions in novel ways. In his session at @ThingsExpo, Gordon Haff, Red Hat Technology Evangelist, will share examples from a wide range of industries – includin...
WebRTC is the future of browser-to-browser communications, and continues to make inroads into the traditional, difficult, plug-in web communications world. The 6th WebRTC Summit continues our tradition of delivering the latest and greatest presentations within the world of WebRTC. Topics include voice calling, video chat, P2P file sharing, and use cases that have already leveraged the power and convenience of WebRTC.
"We build IoT infrastructure products - when you have to integrate different devices, different systems and cloud you have to build an application to do that but we eliminate the need to build an application. Our products can integrate any device, any system, any cloud regardless of protocol," explained Peter Jung, Chief Product Officer at Pulzze Systems, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value. In his session at 20th Cloud Expo, Ed Featherston, director/senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.
"Once customers get a year into their IoT deployments, they start to realize that they may have been shortsighted in the ways they built out their deployment and the key thing I see a lot of people looking at is - how can I take equipment data, pull it back in an IoT solution and show it in a dashboard," stated Dave McCarthy, Director of Products at Bsquare Corporation, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
IoT is rapidly changing the way enterprises are using data to improve business decision-making. In order to derive business value, organizations must unlock insights from the data gathered and then act on these. In their session at @ThingsExpo, Eric Hoffman, Vice President at EastBanc Technologies, and Peter Shashkin, Head of Development Department at EastBanc Technologies, discussed how one organization leveraged IoT, cloud technology and data analysis to improve customer experiences and effici...
Fact is, enterprises have significant legacy voice infrastructure that’s costly to replace with pure IP solutions. How can we bring this analog infrastructure into our shiny new cloud applications? There are proven methods to bind both legacy voice applications and traditional PSTN audio into cloud-based applications and services at a carrier scale. Some of the most successful implementations leverage WebRTC, WebSockets, SIP and other open source technologies. In his session at @ThingsExpo, Da...
"IoT is going to be a huge industry with a lot of value for end users, for industries, for consumers, for manufacturers. How can we use cloud to effectively manage IoT applications," stated Ian Khan, Innovation & Marketing Manager at Solgeniakhela, in this SYS-CON.tv interview at @ThingsExpo, held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA.
As data explodes in quantity, importance and from new sources, the need for managing and protecting data residing across physical, virtual, and cloud environments grow with it. Managing data includes protecting it, indexing and classifying it for true, long-term management, compliance and E-Discovery. Commvault can ensure this with a single pane of glass solution – whether in a private cloud, a Service Provider delivered public cloud or a hybrid cloud environment – across the heterogeneous enter...
The cloud promises new levels of agility and cost-savings for Big Data, data warehousing and analytics. But it’s challenging to understand all the options – from IaaS and PaaS to newer services like HaaS (Hadoop as a Service) and BDaaS (Big Data as a Service). In her session at @BigDataExpo at @ThingsExpo, Hannah Smalltree, a director at Cazena, provided an educational overview of emerging “as-a-service” options for Big Data in the cloud. This is critical background for IT and data professionals...
@GonzalezCarmen has been ranked the Number One Influencer and @ThingsExpo has been named the Number One Brand in the “M2M 2016: Top 100 Influencers and Brands” by Onalytica. Onalytica analyzed tweets over the last 6 months mentioning the keywords M2M OR “Machine to Machine.” They then identified the top 100 most influential brands and individuals leading the discussion on Twitter.
Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still performance data is a secret language only a couple of people understand. The more business becomes digital the more stakeholders are interested in this data including how it relates to business. Some of these people have never used a monitoring tool before. They have a question on their mind like “How is my application doing” but no id...