

"Father of MINIX," Andy Tanenbaum, on Kenneth Brown's Claims Re Linux

Ken Brown's Motivation, Release 2.1


On 20 May 2004, I posted a statement refuting the claim of Ken Brown, President of the Alexis de Tocqueville Institution, that Linus Torvalds didn't write Linux. My statement was mentioned on Slashdot, Groklaw, and many other Internet news sites. This attention resulted in over 150,000 requests to our server in less than a day, which is still standing despite yesterday being a national holiday with no one there to stand next to it saying "You can do it. You can do it." Kudos to Sun Microsystems and the folks who built Apache. My statement was mirrored all over the Internet, so the number of true hits to it is probably a substantial multiple of that. There were also quite a few comments at Slashdot, Groklaw, and other sites, many of them about me. I had never engaged in remote multishrink psychoanalysis on this scale before, so it was a fascinating experience.

The Brown Book

I got an advance copy of Ken Brown's book. I think it is still under embargo, so I won't comment on it. Although I am not an investigative reporter, even I know it is unethical to discuss publications still under embargo. Some of us take ethics more seriously than others. So I won't even reveal the title. Let's call it The Brown Book. There is some precedent for nicknaming books after colors: The International Standard for the audio CD (IEC 60908) is usually called The Red Book. The CD-ROM was described in the Yellow Book (ISO/IEC 10149). Suffice it to say, there is a great deal to criticize in the book. I am sure that will happen when it is published. I may even help out.

Brown's Motivation

What prompted me to write this note today is an e-mail I got yesterday. Actually, I got quite a few :-) , most of them thanking me for the historical material. One of yesterday's e-mails was from Linus, in response to an e-mail from me apologizing for not letting him see my statement in advance. As a matter of courtesy, I did try but I was using his old address and didn't know his new one until I got a very kind email from Linus' father, a Finnish journalist.

In his e-mail, Linus said that Brown never contacted him. No e-mail, no phone call, no personal interview. Nothing. Considering the fact that Brown was writing an explosive book in which he accused Linus of not being the author of Linux, you would think a serious author would at least confront the subject with the accusation and give him a chance to respond. What kind of a reporter talks to people on the periphery of the subject but fails to talk to the main player?

Why did Brown fly all the way to Europe to interview me and (according to an e-mail I got from his seat-mate on the plane) one other person in Scandinavia, at considerable expense, and not at least call Linus? Even if he made a really bad choice of phone company, how much could that cost? Maybe a dollar? I call the U.S. all the time from Amsterdam. It is less than 5 cents a minute. How much could it cost to call California from D.C.?

From reading all the comments posted yesterday, I am now beginning to get the picture. Apparently a lot of people (still) think that I 'hate' Linus for stealing all my glory (see below for more on this). I didn't realize this view was so widespread. I now suspect that Brown believed this, too, and thought that I would be happy to dump all over Linus to get 'revenge.' By flying to Amsterdam he thought he could dig up dirt on Linus and get me to speak evil of him. He thought I would back up his crazy claim that Linus stole Linux from me. Brown was wrong on two counts. First, I bear no 'grudge' against Linus at all. He wrote Linux himself and deserves the credit. Second, I am really not a mean person. Even if I were still angry with him after all these years, I wouldn't choose some sleazy author with a hidden agenda as my vehicle. My home page gets 2500 hits a week. If I had something to say, I could put it there.

When The Brown Book comes out, there will no doubt be a lot of publicity in the mainstream media. Any of you with contacts in the media are actively encouraged to point reporters to this page and my original statement to provide some balance. I really think Brown's motivation should come under scrutiny. I don't believe for a nanosecond that Brown was trying to do a legitimate study of IP and open source or anything like that. I think he was trying to make the case that the people funding him (which he refused to disclose to me despite my asking point blank) wanted to have made. Having an institution with an illustrious-sounding name make the case looks better than having an interested party make the case.

Clearing Up Some Misconceptions

I would like to close by clearing up a few misconceptions and also correcting a couple of errors. First, I REALLY am not angry with Linus. HONEST. He's not angry with me either. I am not some kind of "sore loser" who feels he has been eclipsed by Linus. MINIX was only a kind of fun hobby for me. I am a professor. I teach and do research and write books and go to conferences and do things professors do. I like my job and my students and my university. If you want to get a master's degree there, see my home page for information. I wrote MINIX because I wanted my students to have hands-on experience playing with an operating system. After AT&T forbade teaching from John Lions' book, I decided to write a UNIX-like system for my students to play with. Since I had already written two books at this point, one on computer architecture and one on computer networks, it seemed reasonable to describe the system in a new book on operating systems, which is what I did. I was not trying to replace GNU/HURD or Berkeley UNIX. Heaven knows, I have said this enough times. I just wanted to show my students and other students how you could write a UNIX-like system using modern technology. A lot of other people wanted a free production UNIX with lots of bells and whistles and wanted to convert MINIX into that. I was dragged along in the maelstrom for a while, but when Linux came along, I was actually relieved that I could go back to professoring. I never really applied for the position of King of the Hackers and didn't want the job when it was offered. Linus seems to be doing excellent work and I wish him much success in the future.

While writing MINIX was fun, I don't really regard it as the most important thing I have ever done. It was more of a distraction than anything else. The most important thing I have done is produce a number of incredibly good students, especially Ph.D. students. See my home page for the list. They have done great things. I am as proud as a mother hen. To the extent that Linus can be counted as my student, I'm proud of him, too. Professors like it when their students go on to greater glory. I have also written over 100 published research papers and 14 books which have been translated into about 20 languages. As a result I have become a Fellow of the IEEE, a Fellow of the ACM, and won numerous other awards. For me, these are the things that really count. If MINIX had become a big 'commercial' success I wouldn't have had the time to do all this academic stuff that I am actually more interested in.

Microkernels Revisited

I can't resist saying a few words about microkernels. A microkernel is a very small kernel. If the file system runs inside the kernel, it is NOT a microkernel. The microkernel should handle low-level process management, scheduling, interprocess communication, interrupt handling, and the basics of memory management and little else. The core microkernel of MINIX 1.0 was under 1400 lines of C and assembler. To that you have to add the headers and device drivers, but the totality of everything that ran in kernel mode was under 5000 lines. Microsoft claimed that Windows NT 3.51 was a microkernel. It wasn't. It wasn't even close. Even they dropped the claim with NT 4.0. Some microkernels have been quite successful, such as QNX and L4. I can't for the life of me see why people object to the 20% performance hit a microkernel might give you when they program in languages like Java and Perl where you often get a factor of 20 performance hit. What's the big deal about turning a 3.0 GHz PC into a 2.4 GHz PC due to a microkernel? Surely you once bought a machine appreciably slower than 2.4 GHz and were very happy with it. I would easily give up 20% in performance for a system that was robust, reliable, and wasn't susceptible to many of the ills we see in today's massive operating systems.


I would now like to correct an error in my original statement. One of the e-mails I got yesterday clarified the origins of Coherent. It was not written by Bob Swartz. He was CEO of the Mark Williams Company. Three ex-students from the University of Waterloo, Dave Conroy, Randall Howard, and Johann George, did most of the work. Waterloo is in Canada, where they also play baseball I am told, but only after the ice melts and they can't play hockey. It took the Waterloo students something like 6 man-years to produce Coherent, but this included the kernel, the C compiler, the shell, and ALL the utilities. The kernel is only a tiny fraction of the total code, so it may well be that the kernel itself took a man-year. It took me three years to write MINIX, but I was working on it only in the evenings, and I also wrote 400 pages of text describing the code in that time period (also in the evenings). I think a good programmer can write a 12,000-line kernel in a year.

If you have made it this far, thank you for your time.

Andy Tanenbaum, 21 May 2004

More Stories By Andrew S. Tanenbaum

Andy Tanenbaum is Professor of Computer Science at the Vrije Universiteit in Amsterdam, The Netherlands. He wrote MINIX as a teaching example for his book "Operating Systems: Design and Implementation," which prints an abridged version of the source code in an appendix: about 12,000 lines covering the kernel, memory manager, and file system, written mostly in C.


Most Recent Comments

Randall Howard 08/16/08 10:10:21 PM EDT

Although this article is now four years old, I just stumbled on it doing a different search. I'm one of the three writers of Coherent mentioned in this article. Yes, we all went to University of Waterloo before heading to Chicago to write Coherent. As well, prior to this, when I was working at the Computer Communications Networks Group there, I met Andy Tanenbaum in one of his sojourns. And, I recall him being a really great guy.

Regarding our timeline to develop a kernel, the three of us divvied up the work. Dave Conroy, who had previously written the Decus C compiler (and is now a senior hardware designer at Apple), focused on compiler work as well as a number of other major tasks. I wrote the kernel, many of the libraries, and a number of utilities. Johan George did library and utility work. We started in January 1980 and by December 1980 had moved from cross-compiling to native builds. This would put the kernel development time at 10+ months. At the January 1981 Usenix in San Francisco, I recall Steve Bourne and Dennis Ritchie hacking around with the system and, in particular, analyzing the interrupt latency in the kernel.

Later, when Western Electric lawyers sent Dennis Ritchie up to check on the code, he quickly determined it was indeed an original creation. And, before leaving, he commented that the team at Coherent showed "amazing programmer productivity." We were certainly very pleased.

We had a number of innovations, such as hot-loadable drivers, which I'm not sure Windows ever achieved. It was a lot of fun, but also a lot of years ago.

Chris Laffra 11/03/04 09:51:30 PM EST

Having been a student in a couple of Andy's classes in the mid-eighties, when he was busily working on Minix, I feel both privileged and honored to have done my studies at the Vrije Universiteit in Amsterdam. I still use many of the lessons taught by this kind, very smart, and annoyingly productive individual. [What mere mortal can be a full-time professor and still write an operating system and 14 books, and find time to manage an electoral voting web site?]

I must admit, though, that I had my private suspicions that Andy must have felt some resentment at Linux's success, and Minix's "lack" of it. I am happy and reassured to find out the opposite.

I advise everyone to follow Andy's advice and do their Master's at the Free University, which is not entirely free, of course.

Chris Laffra
Ottawa, Canada

npd 06/01/04 07:08:31 PM EDT

Mr. Tanenbaum,
It is great to see a living legend pass the torch over and over to your students. I only wish I could be so lucky. You do it selflessly, and indirectly you are probably part of the inspiration that gave birth to Linux, and many other things. It is obvious Linus got what you were teaching, and told himself "I can do that," and did it. In some cases he may have said "I can do that better," and did that too. You helped teach him how to figure this stuff out. Now I understand why you teach.

As you said, if you had "gotten caught up in" producing your own kernel, and given up academia, at least a small portion of the great things that are happening right now, would not be.

For this, as I am sure your students would be honored to do, I salute you. I feel like I know where you are coming from, even though I don't know you.


Louis HR Muller 05/24/04 11:10:30 PM EDT

Andrew Tanenbaum has demonstrated that he is a man of character. Had he desired, he could have seized the opportunity to add fuel to the fire by indirectly attacking Linus through Brown. Instead, he supported Linus and has earned the respect of many in the Linux community. As a result, his Minix will receive more recognition as an inspiration for Linus and as a useful educational tool for all those who learned more about Unix by using Minix. A reading of the early e-mails put forth by Linus clearly indicates he did not write Linux because he harbored any vengeance toward Andrew Tanenbaum, but rather because creating Linux presented an intellectual challenge for him and the others who became involved in the Linux project. Therefore, indirectly, Mr. Brown did a service to both Linus and Andy and strengthened their images.

Lion Kuntz 05/24/04 08:03:11 AM EDT

I updated a number of pages on Disinfopedia wiki website to document the culpability of Alexis de SMOKEville's sordid history as a tobacco industry shill. These are some of the new or revised pages, followed by some quotations that search engines will draw upon for results pages. If you post these links in your blogs, you can be pretty sure that every search engine will rank the tobacco connection higher than the FUD pages they post. Over 80% of Americans have quit or never started smoking, so the fans of tobacco shills are always a minority.

S. Fred Singer
"The "de SMOKEville" junk-science that Singer attached his name to was crowded by hired guns wearing lab coats to continue the decades-long disinformation campaign: Academic Advisory Board -- Dr. Nancy Bord, Hoover Institution; Michael Darby, John M. Olin Center for Policy; Michael Gough, Congressional Office of Technology Assessment; Thomas Gale Moore, Hoover Institution; S. Fred Singer, President Science and Environmental Policy Project; Robert D. Tollison, George Mason University, Richard Wagner, George Mason University. [1] I suppose these include the "seven out of ten doctors who prefer Chesterfields" from the ads of yesteryear."

The same John M. Olin Foundation funds the John M. Olin Center for Policy as funds the Alexis de Tocqueville Institution. The Olin, Scaife and Koch foundations fund the entire list above, apart from the Congressional Office of Technology Assessment, which is funded through campaign contributions instead of foundations. Singer, Tollison and Wagner were all from George Mason University, a favorite charity of right-wing donors and energy billionaires Koch and Scaife.

Search for Robert D. Tollison
Search Results
32523 document(s) match your query for Robert D. Tollison .

32,523 mentions in the Tobacco Institute files ordered online in a court settlement. That's pretty good, but no cigar...

Search for S. Fred Singer
Search Results
49778 document(s) match your query for S. Fred Singer

Even though Robert D. Tollison wrote the book on how good for you second-hand smoke is, S. Fred Singer has won the race for covert cash from the disinformation lung polluters of tobacco AND oil companies. Singer helped write the OTHER book that Tollison was only "technical advisor" on, published by Alexis de SMOKEville Institution, er, Alexis de Tocqueville Institution.

In 1994 Cesar Conda was executive director of the Alexis de Tocqueville Institution listed as "Senior Staff and Contributing Associates" on a Lorillard Tobacco Company paid-for publication titled "Science, Economics, and Environmental Policy" by author Kent Jeffreys. [1] Principal Reviewer was listed as S. Fred Singer, and to give this propagandistic tract a sheen of scientific appearance, a loaded gang of "experts" from assorted tobacco-funded front organizations with impressive names was listed: SEPP, Hoover Institution, John M. Olin Center for Policy, George Mason University.

As Executive Director of the Alexis de Tocqueville Institution, Conda had more than a casual association with the production of this deception piece. SEPP was certainly known to him, as an article the same year in Commonsense (Fall 1994), "The New Populism: The Rise of the Property Rights Movement," by Cesar Conda and Mark LaRochelle, mentions SEPP. [2] Kent Jeffreys' bona fides would also be known to him. Jeffreys at the time was listed as environmental studies director [3] for the Competitive Enterprise Institute [4], an organization with close ties to Alexis de Tocqueville.

Ze Plot Thickens 05/24/04 07:52:05 AM EDT

This has been posted at the alt.os newsgroup, from Justin Orndorff ([email protected]) of the AdTI:


I'm conducting some research on behalf of the Alexis de Tocqueville
Institution in Washington, DC. I'd like if someone could shed some
light on the following questions:

1. Describe the components of an operating system, besides the central
component, the kernel.
2. What do programmers usually develop first, the compiler or the kernel?
3. Does this sequence impact the OS at all?
4. What's more complicated, the kernel or the compiler?
5. Why does operating system development take as long as it does? What
are the three key things in operating system development that take the
longest to perfect?
6. Do you need operating systems familiarity to write a kernel? Yes /
no? Elaborate please.
7. In your opinion, why aren't there more operating systems on the market?

Thanks for your time. Best,
Justin Orndorff

aNoN 05/24/04 07:46:21 AM EDT

"Hidden agenda" being Amsterdam's coffeeshops and whorehouses, no doubt.

Not that there's anything WRONG with that. I'm just saying that any excuse to go to Amsterdam is a good one.

What a great city.

madprof 05/24/04 07:44:09 AM EDT

Poor old Ken Brown must be wondering how wise it was to have made that particular trip now!
Curious that someone would spend all that cash and yet have done so little research. Smells of hidden agendas, or not-so-hidden agendas, perhaps?

Minna Kirai 05/24/04 07:42:59 AM EDT

That Tanenbaum is still antagonistic to Linus's system gives him even more credibility. If a friend vouches for you, that might be discounted as a buddy covering for you- but if an enemy says you're innocent, then he's got no motivation to lie on your behalf.
