
Trusted Computing on Linux

Building a trusted platform

In an era where everybody is connected to a potentially harmful Internet and applications grow ever more complex and distributed, controlling what our computers actually do has become significantly harder. At their core, even simple actions (executing software, e-commerce, etc.) rely on trust relationships; what if your computer (or the merchant's) has been compromised and alters your perception of reality? Indeed, at the beginning, Neo did not know there was a Matrix because he trusted everything he saw...

Closer to our world, and without being paranoid, one of the first actions intruders or rootkits take is to replace common commands with fake ones. Is it then possible to guarantee that we'll really execute the code we intended to? How far can you trust the computer of a given merchant not to reveal your credit card number? This is precisely what trusted computing is about: providing the means to know how much a given machine may be trusted.

Actually, using chips to enforce security in the lowest layers isn't new; such chips have existed for many years. However, their high price, difficult integration with commercial software, and heavy impact on system performance have kept them out of mainstream use.

Several major industry players decided to join forces and design a compromise that would meet market needs. The idea was to build a trusted platform, including a new security chip, that would be easier to use and offer more computational power, though perhaps a little less security. This effort first gave birth to the TCPA (Trusted Computing Platform Alliance) in 1999, and then to its successor, the TCG (Trusted Computing Group), in 2003.

Trusted Computing Group

The primary goal of the TCG is to provide the industry with vendor-neutral standard specifications for various platforms (PC, PDA, mobile phone, etc.). To do so, it specifies a subsystem to be integrated into each platform, one that protects the user's computing environment and supplies information and keys to operating systems or applications. More precisely, TCG's proposed subsystem consists of a Trusted Platform Module (TPM) and the TPM Software Stack (TSS).

The TPM is a hardware chip. It provides low-level trusted computing functionality such as protected storage (making sure encryption keys cannot be retrieved even if the platform is compromised), integrity metrics (detecting compromise), and platform attestation (proving to others that the platform has a given property).
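
To make the "integrity metrics" idea concrete, here is a minimal conceptual sketch (plain Python, not a real TPM API) of how a TPM 1.x-style Platform Configuration Register (PCR) accumulates measurements: a PCR can only be extended, never written directly, so its final value commits to every measured component and to the order in which they were loaded. The component names used below are purely illustrative.

    # Conceptual sketch of TPM-style integrity metrics (plain Python, not a
    # real TPM API). A Platform Configuration Register (PCR) can only be
    # "extended", never set directly, so its final value commits to every
    # measured component and to the order in which they were measured.
    import hashlib

    PCR_SIZE = 20  # TPM 1.x PCRs hold a SHA-1 digest

    def extend(pcr, measurement):
        # new_pcr = SHA1(old_pcr || SHA1(measurement))
        digest = hashlib.sha1(measurement).digest()
        return hashlib.sha1(pcr + digest).digest()

    # Simulate measuring a boot chain: BIOS, boot loader, kernel, application.
    pcr = bytes(PCR_SIZE)  # PCRs are reset to all zeroes at power-on
    for component in (b"BIOS image", b"boot loader", b"kernel", b"vpn client"):
        pcr = extend(pcr, component)

    print("final PCR value:", pcr.hex())
    # Changing any component, or the order of measurement, yields a completely
    # different final value - which is how a compromise shows up.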

As for the TSS, it's organized as shown in Figure 1:

  • A TPM device driver, typically provided by the TPM manufacturer
  • An abstraction layer to the TPM drivers, the TDDL, which makes it possible to develop the upper components of the stack independently of the specific TPM chip (a sketch of the raw device access this layer hides follows the list)
  • A core services layer (TCS) that groups the services common to the software stack, such as event management, key and credential management, etc.
  • Various TSS Service Providers (TSP) that, for example, offer access to specific APIs such as PKCS#11
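
To give a feel for what the bottom of this stack deals with, the following sketch talks to a TPM 1.2 chip directly through the character device exposed by the kernel driver (assumed here to be /dev/tpm0) and issues a raw TPM_GetRandom command; this byte-level plumbing is exactly what the TDDL and the layers above it hide from applications. It is a hedged illustration under those assumptions, not production code.

    # A minimal sketch of the byte-level plumbing the TDDL hides: send a raw
    # TPM 1.2 TPM_GetRandom command straight to the device node exposed by the
    # kernel driver. Assumes a TPM 1.2 chip behind /dev/tpm0; error handling
    # is kept to a bare minimum.
    import os
    import struct

    TPM_TAG_RQU_COMMAND = 0x00C1
    TPM_ORD_GetRandom = 0x00000046

    def tpm_get_random(num_bytes, device="/dev/tpm0"):
        # Request layout: tag (2) | total length (4) | ordinal (4) | bytesRequested (4)
        request = struct.pack(">HIII", TPM_TAG_RQU_COMMAND, 14,
                              TPM_ORD_GetRandom, num_bytes)
        fd = os.open(device, os.O_RDWR)
        try:
            os.write(fd, request)          # the driver forwards the command to the chip
            response = os.read(fd, 4096)   # then returns the chip's raw response
        finally:
            os.close(fd)
        # Response layout: tag (2) | length (4) | return code (4) | data length (4) | data
        tag, length, return_code = struct.unpack(">HII", response[:10])
        if return_code != 0:
            raise RuntimeError("TPM error code %d" % return_code)
        (data_len,) = struct.unpack(">I", response[10:14])
        return response[14:14 + data_len]

    if __name__ == "__main__":
        print(tpm_get_random(16).hex())
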
To illustrate the possible benefits of using trusted computing, let's describe a simple case where a system administrator needs to secure an employee's laptop access to the corporate network. The employee accesses his or her company's network using a secret key and specific network access software (e.g., a VPN client). The problem is that the employee's laptop is obviously untrusted; it's carried everywhere and unfortunately is an ideal target for viruses or other malware. If the laptop's corporate network access software and/or the secret key are compromised, corporate security may be seriously affected.

To avoid such a scenario, a possible solution relies on trusted computing. The administrator uses the TPM to seal the secret key to the BIOS, OS, and the network access software. This cryptographically binds the key to a given software stack, so that the TPM will unseal the key only if that stack (BIOS, OS, network access software) has not been tampered with. This virtually establishes trust on an otherwise untrusted platform.
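
The following toy sketch (again plain Python, not the actual TPM_Seal/TPM_Unseal commands) illustrates the principle: the secret is bound to the expected measurement values, and any change in the measured stack makes it unrecoverable. The XOR-based cipher and the measurement strings are purely illustrative assumptions; a real TPM keeps the sealing key inside the chip.

    # Toy illustration of sealing (plain Python, not the real TPM_Seal/TPM_Unseal
    # commands): a secret is bound to the expected PCR values and can only be
    # recovered while the current measurements match them. The XOR-based cipher
    # and the hypothetical measurement strings are purely illustrative; a real
    # TPM keeps the sealing key inside the chip.
    import hashlib
    import hmac

    def composite(pcrs):
        # Digest over the selected PCR values, in the spirit of a PCR composite hash.
        return hashlib.sha1(b"".join(pcrs)).digest()

    def _pad(key, length):
        return (hashlib.sha1(key + b"pad").digest() * (length // 20 + 1))[:length]

    def seal(secret, expected_pcrs):
        key = composite(expected_pcrs)
        blob = bytes(a ^ b for a, b in zip(secret, _pad(key, len(secret))))
        tag = hmac.new(key, blob, hashlib.sha1).digest()
        return blob, tag

    def unseal(blob, tag, current_pcrs):
        key = composite(current_pcrs)
        if not hmac.compare_digest(hmac.new(key, blob, hashlib.sha1).digest(), tag):
            raise PermissionError("software stack has changed; refusing to unseal")
        return bytes(a ^ b for a, b in zip(blob, _pad(key, len(blob))))

    # The VPN key unseals only while the BIOS/OS/VPN-client measurements are intact.
    good = [hashlib.sha1(c).digest() for c in (b"BIOS", b"OS", b"vpn client")]
    bad = [hashlib.sha1(c).digest() for c in (b"BIOS", b"rootkit OS", b"vpn client")]
    blob, tag = seal(b"corporate VPN secret key", good)
    print(unseal(blob, tag, good))   # recovers the key
    # unseal(blob, tag, bad)         # raises PermissionError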

Linux Support for TCG

In practice, TPMs are already well established on the market, although perhaps not that widespread yet. Several chip manufacturers offer TPM chips (e.g., Infineon's SLD 9630 TT or Atmel's AT97SC3201). Intel has developed boards with an integrated TPM (the D865GRH, D915GEV, and D915GUX desktop boards). TPMs even ship with specific desktop or laptop series (IBM ThinkCentre, HP Compaq DC7100, Toshiba Tecra M2, Fujitsu Lifebook S, etc.). The real difficulty in getting your hands on TCG arises later, within the TPM Software Stack. Indeed, mainstream Linux kernels do not natively recognize TPM chips, and solutions for using them are nearly nonexistent at the moment.

On Linux, we are presently aware only of NTRU's TSS and a few research projects, listed in Table 1. Most of these are highly experimental, with limited support for TPM chips and only a selected subset of TCG functions. Clearly, this is currently a developer's or an expert's world; there is no way an end user can benefit from TCG's functionality without getting into the source code.

TCG and Linux

Actually, trusted computing's first exposure to the public was quite controversial. Basically, people worried that this technology would undermine privacy or block software interoperability. Others even pointed out startling potential side effects. The reality is probably somewhat more balanced, and we dare compare trusted computing to a Swiss army knife: it can be extremely useful for getting out of (dangerous?) situations, but obviously it can also hurt.

It's beyond the scope of this article to tackle the privacy issues around TCG in more detail, though we invite interested readers to refer to the resources section for further reading.

Whether we like it or not, trusted computing seems to be part of the future of many commercial systems. Support for TCG is already among the requirements for some industrial Linux systems. The market outlook is extremely promising; indeed, there are still several research and development opportunities:

  • At the hardware level, by introducing new trusted hardware on the market (see, for instance, Intel's trusted keyboard controller).
  • At the operating system level, with a new "trusted" OS making use of the trusted hardware. This would probably take the form of a kernel module, but with broader hooks into the OS.
  • At the application level, with numerous use cases for end-user "trusted" applications, but barely any implementations on Linux yet.
  • In the area of embedded systems - for example, mobile phones, PDAs, or other devices.


Currently, the best way to qualify TCG's market penetration is moderate: TPM chips are already on the market, but their software stack is extremely limited and experimental. Yet, whatever your rationale - for or against TCG technology - with the widespread propagation of viruses and other malware, and the ever-increasing security needs of the industry, trusted computing looks like an extremely promising technology, and TPM chips are very likely to be deployed more and more frequently on the systems around us. It would then be extremely positive for the Linux community - and more generally the open source community - to get involved. Indeed, how much and how well TPMs are supported and integrated could become a selection criterion among operating systems in the future.


Resources

  • Yee, B. "Using Secure Coprocessors." PhD Thesis, CMU-CS-94-149, May 1994.
  • Arnold, T.W., and van Doorn, L.P. "The IBM PCIXCC: A New Cryptographic Coprocessor for the IBM eServer." IBM Journal of Research and Development, Vol. 48, No. 3, May/July 2004.
  • Trusted Computing Group.
  • NTRU Core TCG Software Stack (CTSS).
  • Safford, D. "TCPA Resources."
  • Sailer, R.; Jaeger, T.; van Doorn, L.; and Zheng, X. "TPM-based Linux Runtime Attestation."
  • Wild, O., and Marchesini, J. "Enforcer."
  • Sevinc, P.E. "A Software-based TPM Emulator for Linux."
  • Selhorst, M., and Stueble, C. "Linux Kernel Module for the Infineon Trusted Platform Module SLD 9630 TT."
  • Anderson, R. "Trusted Computing - Frequently Asked Questions," version 1.1, August 2003.
  • Schechter, S.E.; Greenstadt, R.A.; and Smith, M.D. "Trusted Computing, Peer-to-Peer Distribution, and the Economics of Pirated Entertainment." Second Workshop on Economics and Information Security, May 29, 2003.
  • Carrier Grade Linux Hardware Requirements Definition, version 3.
  • Bajikar, S. "Trusted Mobile Keyboard Controller Architecture." Intel Developer Forum, Fall 2003.
  • Wave Systems, Embassy Trust Suite.
  • Linux Devices, January 22, 2003.
  • Walko, J. "ARM links with Trusted Logic for secure mobile, set tops." July 14, 2004.
About Makan Pourzandi

Makan Pourzandi received his doctoral degree in parallel computing in 1995 from the University of Lyon, France. He works for Ericsson Research Canada in the Open Systems Research Department. His research domains are security, cluster computing, and component-based methods for distributed programming. He has more than seven publications in refereed international conferences. Makan has delivered several talks at universities, international conferences, and Open Source forums. He is involved in several Open Source projects, including the Distributed Security Infrastructure, and is a contributor to the security requirements of the Open Source Development Lab (OSDL) Carrier Grade Linux (CGL).

About Axelle Apvrille

Axelle Apvrille currently works for Ericsson Research Canada in the Open Systems Research Department. Her research interests are cryptography, security protocols, and distributed security. She received her computer science engineering degree in 1996 at ENSEIRB, Bordeaux, France.
