
The Science and Art of Open Source Software License Management

Protect your organization from risk while ensuring continuous innovation

The industrial revolution continues: it began with the steam engines of the 18th century, continued with the large-scale steel production, oil exploitation, and electrical and photographic innovations of the 19th century, and moved on to the transportation, communications, computation, and electronics of the 20th century. It is still early in the 21st century, but we can safely say software has become the engine that drives the industrial, economic, medical, and, increasingly, the political dimensions of our existence. The only way to satisfy the demand for the volume and complexity of software needed to keep our world moving is to maximally share and reuse code within and across application domains.

Open Source Software (OSS) is the epitome of code reuse, enabling complex applications to be realized rapidly, economically, and safely. Probably the largest collaborative endeavor in human history to date, open source feeds on itself. Independent studies have converged on the fact that open source is everywhere; depending on the study, 80-100% of all software organizations now use open source software in their products or operations.

There is an implicit understanding that good developers no longer write code from scratch. Rather, they adapt existing code to furnish a desired function. Using off-the-shelf code such as OSS calls for the usual due diligence and precautions associated with deploying any third-party content within an organization. The pedigree of the code, its ownership attributes, and the rules around the use of open source code (typically captured in a license document) govern its introduction within an organization and its suitability for an end-target use.

Open source software brings with it an unusual set of ownership issues. Unlike other commodities, open source code can be brought into an organization freely. While anything purchased in a transaction has implied ownership, the ownership and usage rights of OSS can be confusing to many developers and organizations. Copyright ownership of OSS generally stays with the creator of the open source code. The OSS copyright owner creates and communicates a license that explicitly sets out the rules governing the use of that open source software.

Continuous Versus One-Time OSS Assessment
Detecting and complying with OSS code in a software project has become an important part of a quality and governance process in organizations that create or consume software.

Product quality considerations and standards require that a recorded knowledge of all the third-party components within a product be maintained at all times. These records should also include attributes such as pedigree, defect history and improvements over time, potential vulnerabilities, and code propagation within the organization. These records can be best maintained, not by a one-time examination of the organization's code portfolio, but through an ongoing and structured third-party and open source software adoption process. The practice of creating and updating the records automatically as development proceeds ensures ongoing compliance with the requirements of a quality organization.
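As a rough illustration of the kind of record such a process might maintain, here is a minimal sketch in Python. The field names are hypothetical, loosely inspired by SPDX-style metadata; a real system would follow the organization's own schema.

```python
from dataclasses import dataclass, field

# Hypothetical record for one third-party component. Field names are
# illustrative assumptions, not a standard schema.
@dataclass
class ComponentRecord:
    name: str                       # e.g. "log4j"
    version: str                    # e.g. "2.17.1"
    license_id: str                 # e.g. "Apache-2.0"
    pedigree: str                   # where the code came from
    known_vulnerabilities: list = field(default_factory=list)
    used_in: list = field(default_factory=list)  # products/modules using it

# A Bill of Materials is then simply a collection of such records,
# updated automatically as development proceeds.
bom = [
    ComponentRecord("log4j", "2.17.1", "Apache-2.0",
                    pedigree="downloaded from apache.org",
                    used_in=["billing-service"]),
]

# Recording a newly discovered vulnerability keeps the BoM current.
bom[0].known_vulnerabilities.append("CVE-2021-44832")
```

The point of the structure is that pedigree, vulnerability history, and code propagation live in one queryable place rather than in developers' memories.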

As opposed to the continuous recordkeeping requirements of a quality process, a software audit is a one-time activity aimed at providing insight into intellectual property (IP) ownership or IP rights, in anticipation of a transaction such as a merger or acquisition, or a product shipment to market.

Open source software audits, generally carried out by an external body, involve an examination of a software portfolio in order to detect OSS and third-party code within that portfolio. The result of the audit is a report that highlights open source and third-party components and their attributes. At a high level, the report provides statistics such as the names of any public-domain software packages and whether they are used in modified or unmodified form, the composition of the license mix, copyrights, vulnerabilities, languages, and open source lines of code. At a more detailed level, it lists the specific open source or proprietary packages that were discovered, their license attributes, links to resources that contain additional information, the text of the licenses, copyrights, and known security vulnerabilities associated with the components of the software.

An audit process would also highlight code that is explicitly copyrighted but for which no license is offered or mentioned. Such cases are among the more challenging aspects of establishing IP ownership, as the copyright owner must be contacted for explicit permission to use the code. Likewise, any code that is not in the public domain and has no identifying information, such as headers, must be flagged as requiring further investigation.

The Science: Detecting Third-Party Packages in a Portfolio
Once all OSS and other third-party packages are identified and a software Bill of Materials (BoM) is available, the list can be examined for properties such as licenses, known deficiencies, security vulnerabilities, the various obligations associated with their use, and functions such as encryption that could restrict their use in certain markets.
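One way such an examination might look in practice is sketched below. The license categories and the policy rules are invented for illustration; a real process would encode the organization's own legal guidance.

```python
# Hypothetical policy check over a software Bill of Materials.
# License groupings here are assumptions for illustration only,
# not legal advice.
COPYLEFT = {"GPL-2.0-only", "GPL-3.0-only", "AGPL-3.0-only"}
PERMISSIVE = {"MIT", "BSD-3-Clause", "Apache-2.0"}

def review_bom(bom):
    """Return (component, issue) pairs that need human review."""
    issues = []
    for component, license_id in bom:
        if license_id in COPYLEFT:
            issues.append((component, "copyleft: check distribution obligations"))
        elif license_id not in PERMISSIVE:
            issues.append((component, "unknown license: needs investigation"))
    return issues

bom = [("zlib-clone", "MIT"),
       ("kernel-module", "GPL-2.0-only"),
       ("mystery-lib", "???")]

for component, issue in review_bom(bom):
    print(component, "->", issue)
```

The automated pass does not make the legal call; it narrows thousands of components down to the handful that actually need a person's attention.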

A number of methods can be used to identify open source and commercial software within a software portfolio. A short list of these methods includes the following.

  • Records by developers: Any records maintained by developers and development managers will assist the process.
  • Information held within a file, or folder: A quality-development practice is to include a header on every file that holds information about the software package, organization, copyright and license associated with the file or the package. Often binaries also include identifying information, such as copyright owner, project name, and the license pointers within the file. Another quality practice is to include licensing or other information about a software package within the package.
  • File/folder names and paths: These can be additional indicators of the presence of known public-domain or commercial software.
  • Similarity with public-domain software: Any similarity between a software file, or portions of a software file, and a file in the public domain can accurately indicate the presence of OSS in a portfolio. A full-file match with a public-domain file indicates unmodified use of the OSS. A partial code match indicates use of OSS in a modified form. This distinction is significant because many open source licenses trigger different obligations based on how the file is used.
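The full-file versus partial-match distinction from the last method can be sketched as follows. This is a deliberately simplistic toy: the fixed-size line chunking stands in for the much more robust fingerprinting a real scanner would use against millions of files.

```python
import hashlib

def fingerprints(text, lines_per_chunk=5):
    """Hash fixed-size line chunks: a crude stand-in for real code
    fingerprinting techniques."""
    lines = text.splitlines()
    return {
        hashlib.sha256(
            "\n".join(lines[i:i + lines_per_chunk]).encode()
        ).hexdigest()
        for i in range(0, len(lines), lines_per_chunk)
    }

def classify(candidate, known_oss):
    """Compare a candidate file against a known public-domain file."""
    a, b = fingerprints(candidate), fingerprints(known_oss)
    if a == b:
        return "unmodified"   # full-file match: OSS used as-is
    if a & b:
        return "modified"     # partial match: OSS used in modified form
    return "no match"
```

Because the obligations under many licenses differ between modified and unmodified use, even this coarse three-way classification is already actionable.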

There are hundreds of thousands of public-domain projects accessible to developers. When you consider that an OSS project can have multiple versions in the public domain, and each package can consist of anything between two and 200,000 files, we gain an appreciation for the scale of the task involved in this method. Manual identification of code similarity against millions of files is obviously impractical. Only intelligent automated solutions can go through a software portfolio and examine the similarity between each and every file in that portfolio and the software files in the public domain.

The Art: Reading Between the Lines
The methods described above could theoretically provide insight into composition of a code portfolio. However, those methods alone are not sufficient to reveal an accurate view of the code composition.

  • Manual records are the least reliable method, as third-party content is often brought into a project without a record being registered. Also, in today's typical development environment it is very difficult to guarantee access to the developer who originally introduced a piece of software.
  • File header information is not necessarily an accurate representation of a file's pedigree and license. Headers can be changed by a developer, or even automatically, as happened with the Linux kernel header files used in Android packages.
  • Almost all OSS uses OSS. For example, there are more than 65,000 instances of commons.logging (a popular Apache logging layer), and more than 50,000 instances each of Log4j (another Apache logging utility for Java) and JUnit (a popular testing framework for Java) code in various public-domain projects. This leads to OSS project and license nesting complexities and contributes to challenges in correctly identifying the OSS packages within a portfolio. It is critical to detect the original (genesis) OSS project as well as the versions of both the original and the enclosing OSS projects. Practical examples of this challenge are:
    • Open source packages that use other OSS but do not maintain or propagate the original OSS license. Depending on the original OSS license and the license of the derived OSS package, this may not be legitimate, and it can lead to erroneous conclusions about the quality of, and the usage obligations associated with, the organization's software.
    • Open source packages that change the license in subsequent versions, either by moving from one version of a license to another (for example, from GPLv2 to GPLv3 on a new release of the OSS project) or by moving from one license to a different but compatible license.
    • Projects where the OSS file or folder name is modified. This happens regularly on libraries (those with .jar or .bin extensions), and less frequently on source code files.
    • Seemingly known but non-existent licenses mentioned within a file or folder. A prime example is an OSS package that claims it is released under "GPL." There is no license called simply GPL; only versioned GPL licenses, such as GPLv1, GPLv2, and GPLv3, exist.
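A tiny illustration of how a scanner might flag the bare "GPL" ambiguity from the last example: the mapping below is an assumption for illustration, not a legal determination of what any declared license means.

```python
# Illustrative normalization of declared license strings found in file
# headers. The mapping and the "needs review" flag are assumptions.
KNOWN = {
    "GPLv2": "GPL-2.0-only",
    "GPLv3": "GPL-3.0-only",
    "Apache 2.0": "Apache-2.0",
    "MIT": "MIT",
}

def normalize_license(declared):
    """Return (normalized_id, needs_review) for a declared license string."""
    declared = declared.strip()
    if declared in KNOWN:
        return KNOWN[declared], False
    if declared.upper() == "GPL":
        # A bare "GPL" names no specific license version: flag for review.
        return None, True
    # Anything unrecognized also needs human review.
    return None, True
```

Strings that cannot be resolved to a specific, versioned license are exactly the cases where the human expertise described below comes into play.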

The art of OSS audit activity and license management relies on a clear understanding of the open source software community, open source packages, open source licenses, and development practices. This understanding comes from academic knowledge as well as experience in scanning, reviewing, and auditing hundreds of software portfolios. Automated solutions that combine the science of scanning and license management with empirical methods that embody the art of open source package detection and license discovery can significantly speed up the discovery and management process and minimize, although not eliminate, the need for human involvement.

Conclusion
Rapid software development is necessary to sustain the pace of innovation needed in today's world, and the use of OSS is perhaps the best way to maintain this pace. While most organizations now use open source to their advantage, they must also avoid the potential complications that OSS license obligations introduce. The complexity of OSS licenses makes it almost impossible to manage obligations manually; this is where automated solutions come in. Conducting a one-time audit or, preferably, having a continuous process in place to automatically detect OSS licenses and their obligations is the best way to protect your organization from risk while ensuring continuous innovation.

More Stories By Kamyar Emami

Kamyar Emami has 20+ years of international technology and business experience in transportation, telecommunications, and the oil and gas industries. He is currently the COO of Protecode (www.protecode.com), and oversees the development of the company’s open source license management tools.


