
"Father of MINIX," Andy Tanenbaum, on Kenneth Brown's Claims Re Linux

Ken Brown's Motivation, Release 2.1


On 20 May 2004, I posted a statement refuting the claim of Ken Brown, President of the Alexis de Tocqueville Institution, that Linus Torvalds didn't write Linux. My statement was mentioned on Slashdot, Groklaw, and many other Internet news sites. This attention resulted in over 150,000 requests to our server in less than a day, which is still standing despite yesterday being a national holiday with no one there to stand next to it saying "You can do it. You can do it." Kudos to Sun Microsystems and the folks who built Apache. My statement was mirrored all over the Internet, so the number of true hits to it is probably a substantial multiple of that. There were also quite a few comments at Slashdot, Groklaw, and other sites, many of them about me. I had never engaged in remote multishrink psychoanalysis on this scale before, so it was a fascinating experience.

The Brown Book

I got an advance copy of Ken Brown's book. I think it is still under embargo, so I won't comment on it. Although I am not an investigative reporter, even I know it is unethical to discuss publications still under embargo. Some of us take ethics more seriously than others. So I won't even reveal the title. Let's call it The Brown Book. There is some precedent for nicknaming books after colors: The International Standard for the audio CD (IS 10149) is usually called The Red Book. The CD-ROM was described in the Yellow Book. Suffice it to say, there is a great deal to criticize in the book. I am sure that will happen when it is published. I may even help out.

Brown's Motivation

What prompted me to write this note today is an e-mail I got yesterday. Actually, I got quite a few :-) , most of them thanking me for the historical material. One of yesterday's e-mails was from Linus, in response to an e-mail from me apologizing for not letting him see my statement in advance. As a matter of courtesy, I did try, but I was using his old transmeta.com address and didn't know his new one until I got a very kind e-mail from Linus' father, a Finnish journalist.

In his e-mail, Linus said that Brown never contacted him. No e-mail, no phone call, no personal interview. Nothing. Considering the fact that Brown was writing an explosive book in which he accused Linus of not being the author of Linux, you would think a serious author would at least confront the subject with the accusation and give him a chance to respond. What kind of a reporter talks to people on the periphery of the subject but fails to talk to the main player?

Why did Brown fly all the way to Europe to interview me and (according to an e-mail I got from his seat-mate on the plane) one other person in Scandinavia, at considerable expense, and not at least call Linus? Even if he made a really bad choice of phone company, how much could that cost? Maybe a dollar? I call the U.S. all the time from Amsterdam. It is less than 5 cents a minute. How much could it cost to call California from D.C.?

From reading all the comments posted yesterday, I am now beginning to get the picture. Apparently a lot of people (still) think that I 'hate' Linus for stealing all my glory (see below for more on this). I didn't realize this view was so widespread. I now suspect that Brown believed this, too, and thought that I would be happy to dump all over Linus to get 'revenge.' By flying to Amsterdam he thought he could dig up dirt on Linus and get me to speak evil of him. He thought I would back up his crazy claim that Linus stole Linux from me. Brown was wrong on two counts. First, I bear no 'grudge' against Linus at all. He wrote Linux himself and deserves the credit. Second, I am really not a mean person. Even if I were still angry with him after all these years, I wouldn't choose some sleazy author with a hidden agenda as my vehicle. My home page gets 2500 hits a week. If I had something to say, I could put it there.

When The Brown Book comes out, there will no doubt be a lot of publicity in the mainstream media. Any of you with contacts in the media are actively encouraged to point reporters to this page and my original statement to provide some balance. I really think Brown's motivation should come under scrutiny. I don't believe for a nanosecond that Brown was trying to do a legitimate study of IP and open source or anything like that. I think he was trying to make the case that the people funding him (whose identity he refused to disclose to me despite my asking point blank) wanted made. Having an institution with an illustrious-sounding name make the case looks better than having an interested party make the case.

Clearing Up Some Misconceptions

I would like to close by clearing up a few misconceptions and also correcting a couple of errors. First, I REALLY am not angry with Linus. HONEST. He's not angry with me either. I am not some kind of "sore loser" who feels he has been eclipsed by Linus. MINIX was only a kind of fun hobby for me. I am a professor. I teach and do research and write books and go to conferences and do things professors do. I like my job and my students and my university. If you want to get a master's there, see my home page for information. I wrote MINIX because I wanted my students to have hands-on experience playing with an operating system. After AT&T forbade teaching from John Lions' book, I decided to write a UNIX-like system for my students to play with. Since I had already written two books at this point, one on computer architecture and one on computer networks, it seemed reasonable to describe the system in a new book on operating systems, which is what I did. I was not trying to replace GNU/HURD or Berkeley UNIX. Heaven knows, I have said this enough times. I just wanted to show my students and other students how you could write a UNIX-like system using modern technology. A lot of other people wanted a free production UNIX with lots of bells and whistles and wanted to convert MINIX into that. I was dragged along in the maelstrom for a while, but when Linux came along, I was actually relieved that I could go back to professoring. I never really applied for the position of King of the Hackers and didn't want the job when it was offered. Linus seems to be doing excellent work and I wish him much success in the future.

While writing MINIX was fun, I don't really regard it as the most important thing I have ever done. It was more of a distraction than anything else. The most important thing I have done is produce a number of incredibly good students, especially Ph.D. students. See my home page for the list. They have done great things. I am as proud as a mother hen. To the extent that Linus can be counted as my student, I'm proud of him, too. Professors like it when their students go on to greater glory. I have also written over 100 published research papers and 14 books which have been translated into about 20 languages. As a result I have become a Fellow of the IEEE, a Fellow of the ACM, and won numerous other awards. For me, these are the things that really count. If MINIX had become a big 'commercial' success I wouldn't have had the time to do all this academic stuff that I am actually more interested in.

Microkernels Revisited

I can't resist saying a few words about microkernels. A microkernel is a very small kernel. If the file system runs inside the kernel, it is NOT a microkernel. The microkernel should handle low-level process management, scheduling, interprocess communication, interrupt handling, and the basics of memory management and little else. The core microkernel of MINIX 1.0 was under 1400 lines of C and assembler. To that you have to add the headers and device drivers, but the totality of everything that ran in kernel mode was under 5000 lines. Microsoft claimed that Windows NT 3.51 was a microkernel. It wasn't. It wasn't even close. Even they dropped the claim with NT 4.0. Some microkernels have been quite successful, such as QNX and L4. I can't for the life of me see why people object to the 20% performance hit a microkernel might give you when they program in languages like Java and Perl where you often get a 20x performance hit. What's the big deal about turning a 3.0 GHz PC into a 2.4 GHz PC due to a microkernel? Surely you once bought a machine appreciably slower than 2.4 GHz and were very happy with it. I would easily give up 20% in performance for a system that was robust, reliable, and wasn't susceptible to many of the ills we see in today's massive operating systems.


I would now like to correct an error in my original statement. One of the e-mails I got yesterday clarified the origins of Coherent. It was not written by Bob Swartz. He was CEO of the Mark Williams Company. Three ex-students from the University of Waterloo, Dave Conroy, Randall Howard, and Johann George, did most of the work. Waterloo is in Canada, where they also play baseball I am told, but only after the ice melts and they can't play hockey. It took the Waterloo students something like 6 man-years to produce Coherent, but this included the kernel, the C compiler, the shell, and ALL the utilities. The kernel is only a tiny fraction of the total code, so it may well be that the kernel itself took a man-year. It took me three years to write MINIX, but I was working on it only in the evenings, and I also wrote 400 pages of text describing the code in that time period (also in the evenings). I think a good programmer can write a 12,000 line kernel in a year.

If you have made it this far, thank you for your time.

Andy Tanenbaum, 21 May 2004

More Stories By Andrew S. Tanenbaum

Andy Tanenbaum is Professor of Computer Science at the Vrije Universiteit in Amsterdam, The Netherlands. He wrote MINIX as a teaching example to accompany his book "Operating Systems: Design and Implementation." An abridged version of the source code, about 12,000 lines covering the kernel, memory manager, and file system, is printed in the book; it is mostly written in C.


Most Recent Comments
Randall Howard 08/16/08 10:10:21 PM EDT

Although this article is now four years old, I just stumbled on it doing a different search. I'm one of the three writers of Coherent mentioned in this article. Yes, we all went to the University of Waterloo before heading to Chicago to write Coherent. As well, prior to this, when I was working at the Computer Communications Networks Group there, I met Andy Tanenbaum on one of his sojourns. And I recall him being a really great guy.

Regarding our timeline to develop a kernel, the three of us divvied up the work. Dave Conroy, who had previously written the Decus C compiler (and is now a senior hardware designer at Apple), focused on compiler work as well as a number of other major tasks. I wrote the kernel, many of the libraries, and a number of utilities. Johan George did library and utility work. We started in January 1980 and by December 1980 had moved from cross-compiling to native builds. This would put the kernel development time at 10+ months. At the January 1981 Usenix in San Francisco, I recall Steve Bourne and Dennis Ritchie hacking around with the system and, in particular, analyzing the interrupt latency in the kernel.

Later, when Western Electric lawyers sent Dennis Ritchie up to check on the code he quickly determined it was indeed an original creation. And, before leaving, he commented that the team at Coherent showed "amazing programmer productivity." We were certainly very pleased.

We had a number of innovations such as hot-loadable drivers which I'm not sure Windows ever achieved. It was a lot of fun, but also a lot of years ago.

Chris Laffra 11/03/04 09:51:30 PM EST

Having been a student in a couple of Andy's classes in the mid-eighties, when he was busily working on Minix, I feel both privileged and honored to have done my studies at the Vrije Universiteit in Amsterdam. I still use many of the lessons taught by this kind, very smart, and annoyingly productive individual. [What mere mortal can be a full-time professor and still write an operating system and 14 books, and find time to manage an electoral voting web site?]

I must admit, though, that I had my private suspicions that Andy must have felt some resentment at Linux's success and Minix's "lack" of it. I am happy and reassured to find out the opposite.

I advise everyone to follow Andy's advice and do their Master's at the Free University, which is not entirely free, of course.

Chris Laffra
Ottawa, Canada

npd 06/01/04 07:08:31 PM EDT

Mr. Tanenbaum,
It is great to see a living legend pass the torch over and over to his students. I only wish I could be so lucky. You do it selflessly, and indirectly you are probably part of the inspiration that gave birth to Linux, and many other things. It is obvious Linus got what you were teaching and told himself, "I can do that," and did it. In some cases he may have said, "I can do that better," and did that too. You helped teach him how to figure this stuff out. Now I understand why you teach.

As you said, if you had "gotten caught up in" producing your own kernel, and given up academia, at least a small portion of the great things that are happening right now, would not be.

For this, as I am sure your students would be honored to do, I salute you. I feel like I know where you are coming from, even though I don't know you.


Louis HR Muller 05/24/04 11:10:30 PM EDT

Andrew Tanenbaum has demonstrated that he is a man of character. Had he desired, he could have seized the opportunity to add fuel to the fire by indirectly attacking Linus through Brown. Instead, he supported Linus and has earned the respect of many in the Linux community. As a result, his Minix will receive more recognition as an inspiration for Linus and a useful educational tool for all those who did learn more about Unix by using Minix. A reading of the early e-mails put forth by Linus clearly indicates he did not write Linux because he harbored any vengeance towards Andrew Tanenbaum but rather because creating Linux presented an intellectual challenge for him and the others that became involved in the Linux project. Therefore, indirectly, Mr Brown did a service to both Linus and Andy and strengthened their images.

Lion Kuntz 05/24/04 08:03:11 AM EDT

I updated a number of pages on Disinfopedia wiki website to document the culpability of Alexis de SMOKEville's sordid history as a tobacco industry shill. These are some of the new or revised pages, followed by some quotations that search engines will draw upon for results pages. If you post these links in your blogs, you can be pretty sure that every search engine will rank the tobacco connection higher than the FUD pages they post. Over 80% of Americans have quit or never started smoking, so the fans of tobacco shills are always a minority.


From http://www.disinfopedia.org/wiki.phtml?title=S._Fred_Singer
S. Fred Singer
"The "de SMOKEville" junk-science that Singer attached his name to was crowded by hired guns wearing lab coats to continue the decades-long disinformation campaign: Academic Advisory Board -- Dr. Nancy Bord, Hoover Institution; Michael Darby, John M. Olin Center for Policy; Michael Gough, Congressional Office of Technology Assessment; Thomas Gale Moore, Hoover Institution; S. Fred Singer, President Science and Environmental Policy Project; Robert D. Tollison, George Mason University, Richard Wagner, George Mason University. [1] I suppose these include the "seven out of ten doctors who prefer Chesterfields" from the ads of yesteryear."

The same John M. Olin Foundation funds John M. Olin Center for Policy as funds Alexis de Tocqueville Institution. Olin, Scaife and Koch foundations fund the entire list above, apart from the Congressional Office of Technology Assessment which is funded through campaign contributions instead of foundations. Singer, Tollison and Wagner were all from the George Mason University, favorite charities of right-wing donors and energy billionaires Koch and Scaife."


Search for Robert D. Tollison
Search Results
32523 document(s) match your query for Robert D. Tollison .

32,523 mentions in the Tobacco Institute files ordered online in a court settlement. That's pretty good, but no cigar...

Search for S. Fred Singer
Search Results
49778 document(s) match your query for S. Fred Singer

Even though Robert D. Tollison wrote the book on how good for you second-hand smoke is, S. Fred Singer has won the race for covert cash from the disinformation lung polluters of tobacco AND oil companies. Singer helped write the OTHER book that Tollison was only "technical advisor" on, published by Alexis de SMOKEville Institution, er, Alexis de Tocqueville Institution.

In 1994 Cesar Conda was executive director of the Alexis de Tocqueville Institution listed as "Senior Staff and Contributing Associates" on a Lorillard Tobacco Company paid-for publication titled "Science, Economics, and Environmental Policy" by author Kent Jeffreys. [1] Principal Reviewer was listed as S. Fred Singer, and to give this propagandistic tract a sheen of scientific appearance, a loaded gang of "experts" from assorted tobacco-funded front organizations with impressive names was listed: SEPP, Hoover Institution, John M. Olin Center for Policy, George Mason University.

As Executive Director of Alexis de Tocqueville Institution, Conda had more than a casual association with the production of this deception piece. SEPP was certainly known to him, as an article the same year in Commonsense (Fall 1994), "The New Populism: The Rise of the Property Rights Movement," by Cesar Conda and Mark LaRochelle, mentions SEPP. [2] Kent Jeffreys' bona fides would also be known to him. Jeffreys at the time was listed as environmental studies director [3] for Competitive Enterprise Institute [4], an organization with close ties to Alexis de Tocqueville.

Ze Plot Thickens 05/24/04 07:52:05 AM EDT

This has been posted at the alt.os newsgroup, from Justin Orndorff ([email protected]) of the AdTI:


I'm conducting some research on behalf of the Alexis de Tocqueville
Institution in Washington, DC. I'd like if someone could shed some
light on the following questions:

1. Describe the components of an operating system, besides the central
component, the kernel.
2. What do programmers usually develop first, the compiler or the
kernel?
3. Does this sequence impact the OS at all?
4. What's more complicated, the kernel or the compiler?
5. Why does operating system development take as long as it does? What
are the three key things in operating system development that take the
longest to perfect?
6. Do you need operating systems familiarity to write a kernel? Yes /
no? Elaborate please.
7. In your opinion, why aren't there more operating systems on the
market?

Thanks for your time. Best,
Justin Orndorff

aNoN 05/24/04 07:46:21 AM EDT

"Hidden agenda" being amsterdam's coffeeshops and whorehouses, no doubt.

Not that theres anything WRONG with that. Im just saying that any excuse to go to amsterdam is a good one.

What a great city.

madprof 05/24/04 07:44:09 AM EDT

Poor old Ken Brown must be wondering how wise it was to have made that particular trip now!
Curious that someone would spend all that cash and yet have done so little research. Smells of hidden agendas, or not-so-hidden agendas, perhaps?

Minna Kirai 05/24/04 07:42:59 AM EDT

That Tanenbaum is still antagonistic to Linus's system gives him even more credibility. If a friend vouches for you, that might be discounted as a buddy covering for you, but if an enemy says you're innocent, then he's got no motivation to lie on your behalf.
