By Andrew S. Tanenbaum
May 24, 2004 12:00 AM EDT
Background
On 20 May 2004, I posted a statement refuting the claim of Ken Brown, President of the Alexis de Tocqueville Institution, that Linus Torvalds didn't write Linux. My statement was mentioned on Slashdot, Groklaw, and many other Internet news sites. This attention resulted in over 150,000 requests to our server in less than a day, which is still standing despite yesterday being a national holiday with no one there to stand next to it saying "You can do it. You can do it." Kudos to Sun Microsystems and the folks who built Apache. My statement was mirrored all over the Internet, so the number of true hits to it is probably a substantial multiple of that. There were also quite a few comments at Slashdot, Groklaw, and other sites, many of them about me. I had never engaged in remote multishrink psychoanalysis on this scale before, so it was a fascinating experience.
The Brown Book
I got an advance copy of Ken Brown's book. I think it is still under embargo, so I won't comment on it. Although I am not an investigative reporter, even I know it is unethical to discuss publications still under embargo. Some of us take ethics more seriously than others. So I won't even reveal the title. Let's call it The Brown Book. There is some precedent for nicknaming books after colors: The International Standard for the audio CD (IS 10149) is usually called The Red Book. The CD-ROM was described in the Yellow Book. Suffice it to say, there is a great deal to criticize in the book. I am sure that will happen when it is published. I may even help out.
What prompted me to write this note today is an e-mail I got yesterday. Actually, I got quite a few :-), most of them thanking me for the historical material. One of yesterday's e-mails was from Linus, in response to an e-mail from me apologizing for not letting him see my statement in advance. As a matter of courtesy, I did try, but I was using his old transmeta.com address and didn't know his new one until I got a very kind e-mail from Linus' father, a Finnish journalist.
In his e-mail, Linus said that Brown never contacted him. No e-mail, no phone call, no personal interview. Nothing. Considering the fact that Brown was writing an explosive book in which he accused Linus of not being the author of Linux, you would think a serious author would at least confront the subject with the accusation and give him a chance to respond. What kind of a reporter talks to people on the periphery of the subject but fails to talk to the main player?
Why did Brown fly all the way to Europe to interview me and (according to an e-mail I got from his seat-mate on the plane) one other person in Scandinavia, at considerable expense, and not at least call Linus? Even if he made a really bad choice of phone company, how much could that cost? Maybe a dollar? I call the U.S. all the time from Amsterdam. It is less than 5 cents a minute. How much could it cost to call California from D.C.?
From reading all the comments posted yesterday, I am now beginning to get the picture. Apparently a lot of people (still) think that I 'hate' Linus for stealing all my glory (see below for more on this). I didn't realize this view was so widespread. I now suspect that Brown believed this, too, and thought that I would be happy to dump all over Linus to get 'revenge.' By flying to Amsterdam he thought he could dig up dirt on Linus and get me to speak evil of him. He thought I would back up his crazy claim that Linus stole Linux from me. Brown was wrong on two counts. First, I bear no 'grudge' against Linus at all. He wrote Linux himself and deserves the credit. Second, I am really not a mean person. Even if I were still angry with him after all these years, I wouldn't choose some sleazy author with a hidden agenda as my vehicle. My home page gets 2500 hits a week. If I had something to say, I could put it there.
When The Brown Book comes out, there will no doubt be a lot of publicity in the mainstream media. Any of you with contacts in the media are actively encouraged to point reporters to this page and my original statement to provide some balance. I really think Brown's motivation should come under scrutiny. I don't believe for a nanosecond that Brown was trying to do a legitimate study of IP and open source or anything like that. I think he was trying to make the case that the people funding him (whose identity he refused to disclose despite my asking point blank) wanted made. Having an institution with an illustrious-sounding name make the case looks better than having an interested party make the case.
Clearing Up Some Misconceptions
I would like to close by clearing up a few misconceptions and also correcting a couple of errors. First, I REALLY am not angry with Linus. HONEST. He's not angry with me either. I am not some kind of "sore loser" who feels he has been eclipsed by Linus. MINIX was only a kind of fun hobby for me. I am a professor. I teach and do research and write books and go to conferences and do things professors do. I like my job and my students and my university. If you want to get a master's degree there, see my home page for information. I wrote MINIX because I wanted my students to have hands-on experience playing with an operating system. After AT&T forbade teaching from John Lions' book, I decided to write a UNIX-like system for my students to play with. Since I had already written two books at this point, one on computer architecture and one on computer networks, it seemed reasonable to describe the system in a new book on operating systems, which is what I did. I was not trying to replace GNU/HURD or Berkeley UNIX. Heaven knows, I have said this enough times. I just wanted to show my students and other students how you could write a UNIX-like system using modern technology. A lot of other people wanted a free production UNIX with lots of bells and whistles and wanted to convert MINIX into that. I was dragged along in the maelstrom for a while, but when Linux came along, I was actually relieved that I could go back to professoring. I never really applied for the position of King of the Hackers and didn't want the job when it was offered. Linus seems to be doing excellent work and I wish him much success in the future.
While writing MINIX was fun, I don't really regard it as the most important thing I have ever done. It was more of a distraction than anything else. The most important thing I have done is produce a number of incredibly good students, especially Ph.D. students. See my home page for the list. They have done great things. I am as proud as a mother hen. To the extent that Linus can be counted as my student, I'm proud of him, too. Professors like it when their students go on to greater glory. I have also written over 100 published research papers and 14 books which have been translated into about 20 languages. As a result I have become a Fellow of the IEEE, a Fellow of the ACM, and won numerous other awards. For me, these are the things that really count. If MINIX had become a big 'commercial' success I wouldn't have had the time to do all this academic stuff that I am actually more interested in.
I can't resist saying a few words about microkernels. A microkernel is a very small kernel. If the file system runs inside the kernel, it is NOT a microkernel. The microkernel should handle low-level process management, scheduling, interprocess communication, interrupt handling, and the basics of memory management, and little else. The core microkernel of MINIX 1.0 was under 1400 lines of C and assembler. To that you have to add the headers and device drivers, but the totality of everything that ran in kernel mode was under 5000 lines. Microsoft claimed that Windows NT 3.51 was a microkernel. It wasn't. It wasn't even close. Even they dropped the claim with NT 4.0. Some microkernels have been quite successful, such as QNX and L4. I can't for the life of me see why people object to the 20% performance hit a microkernel might give you when they program in languages like Java and Perl where you often get a factor of 20 performance hit. What's the big deal about turning a 3.0 GHz PC into a 2.4 GHz PC due to a microkernel? Surely you once bought a machine appreciably slower than 2.4 GHz and were very happy with it. I would easily give up 20% in performance for a system that was robust, reliable, and wasn't susceptible to many of the ills we see in today's massive operating systems.
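To make the division of labor concrete, here is a minimal user-space sketch in C of the rendezvous-style, fixed-size message passing a microkernel of this sort is built around: the kernel only copies small messages between processes, and servers such as the file system run as ordinary processes with a receive/reply loop. All names, the message layout, and the single mailbox slot are invented for illustration; this is not the actual MINIX interface, where send and receive are kernel traps and a scheduler decides who runs next.

```c
#include <string.h>

/* Illustrative message codes (not real MINIX constants). */
enum { FS_READ = 1, REPLY_OK = 0 };

/* Fixed-size message: everything crosses the "kernel" in this shape. */
typedef struct {
    int m_source;   /* sending process number */
    int m_type;     /* request or reply code  */
    int m_arg1;     /* e.g. a file descriptor */
    int m_arg2;     /* e.g. a byte count      */
} message;

/* One mailbox slot stands in for the kernel's rendezvous. A real
   microkernel would block the sender until the receiver is ready;
   here msg_send/msg_receive simply copy through the slot. */
static message slot;

static void msg_send(int dst, const message *m)  { (void)dst; slot = *m; }
static void msg_receive(int src, message *m)     { (void)src; *m = slot; }

/* One iteration of a user-space "file system server" main loop:
   wait for a request, handle it, reply to unblock the caller. */
static void fs_server_step(void) {
    message req, rep;
    msg_receive(-1 /* ANY */, &req);
    memset(&rep, 0, sizeof(rep));
    if (req.m_type == FS_READ) {
        rep.m_type = REPLY_OK;
        rep.m_arg2 = req.m_arg2;   /* pretend every byte was read */
    }
    msg_send(req.m_source, &rep);
}

/* What a library read() wrapper might reduce to: build a message,
   send it to the FS server, block until the reply comes back. */
int do_read(int fd, int nbytes) {
    message m;
    m.m_source = 42;               /* our (made-up) process number */
    m.m_type   = FS_READ;
    m.m_arg1   = fd;
    m.m_arg2   = nbytes;
    msg_send(0 /* FS */, &m);      /* request...                   */
    fs_server_step();              /* (the scheduler would run FS) */
    msg_receive(0, &m);            /* ...and await the reply       */
    return m.m_type == REPLY_OK ? m.m_arg2 : -1;
}
```

The point of the sketch is that the file system never touches kernel mode: a crash in `fs_server_step` would take down one server process, not the machine, which is the robustness a microkernel buys for its performance cost.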
I would now like to correct an error in my original statement. One of the e-mails I got yesterday clarified the origins of Coherent. It was not written by Bob Swartz. He was CEO of the Mark Williams Company. Three ex-students from the University of Waterloo, Dave Conroy, Randall Howard, and Johann George, did most of the work. Waterloo is in Canada, where they also play baseball I am told, but only after the ice melts and they can't play hockey. It took the Waterloo students something like 6 man-years to produce Coherent, but this included the kernel, the C compiler, the shell, and ALL the utilities. The kernel is only a tiny fraction of the total code, so it may well be that the kernel itself took a man-year. It took me three years to write MINIX, but I was working on it only in the evenings, and I also wrote 400 pages of text describing the code in that time period (also in the evenings). I think a good programmer can write a 12,000 line kernel in a year.
If you have made it this far, thank you for your time.
Andy Tanenbaum, 21 May 2004
|Randall Howard 08/16/08 10:10:21 PM EDT|
Although this article is now four years old, I just stumbled on it doing a different search. I'm one of the three writers of Coherent mentioned in this article. Yes, we all went to University of Waterloo before heading to Chicago to write Coherent. As well, prior to this, when I was working at the Computer Communications Networks Group there, I met Andy Tanenbaum in one of his sojourns. And, I recall him being a really great guy.
Regarding our timeline to develop a kernel, the three of us divvied up the work. Dave Conroy, who had previously written the Decus C compiler (and is now a senior hardware designer at Apple) focused on compiler work as well as a number of other major tasks. I wrote the kernel, many of the libraries and a number of utilities. Johan George did library and utility work. We started in January 1980 and by December 1980 moved from cross-compiling to native builds. This would put the kernel development time at 10+ months. At the January 1981 Usenix in San Francisco, I recall Steve Bourne and Dennis Ritchie hacking around with the system and, in particular, analyzing the interrupt latency in the kernel.
Later, when Western Electric lawyers sent Dennis Ritchie up to check on the code he quickly determined it was indeed an original creation. And, before leaving, he commented that the team at Coherent showed "amazing programmer productivity." We were certainly very pleased.
We had a number of innovations such as hot-loadable drivers which I'm not sure Windows ever achieved. It was a lot of fun, but also a lot of years ago.
|Chris Laffra 11/03/04 09:51:30 PM EST|
Having been a student in a couple of Andy's classes in the mid-eighties, when he was busily working on Minix, I feel both privileged and honored to have done my studies at the Vrije Universiteit in Amsterdam. I still use many of the lessons taught by this kind, very smart, and annoyingly productive individual. [What mere mortal can be a full-time professor and still write an operating system and 14 books, and find time to manage an electoral voting web site?]
I must admit though that I had my private suspicions that Andy must have felt some resentment at Linux's success, and Minix's "lack" of it. I am happy and reassured to find out the opposite.
I advise everyone to follow Andy's advice and do their Master's at the Free University, which is not entirely free, of course.
|npd 06/01/04 07:08:31 PM EDT|
As you said, if you had "gotten caught up in" producing your own kernel and given up academia, at least a small portion of the great things that are happening right now would not be.
For this, as I am sure your students would be honored to do, I salute you. I feel like I know where you are coming from, even though I don't know you.
|Louis HR Muller 05/24/04 11:10:30 PM EDT|
Andrew Tanenbaum has demonstrated that he is a man of character. Had he desired, he could have seized the opportunity to add fuel to the fire by indirectly attacking Linus through Brown. Instead, he supported Linus and has earned the respect of many in the Linux community. As a result, his Minix will receive more recognition as an inspiration for Linus and a useful educational tool for all those who learned more about Unix by using Minix. A reading of the early e-mails put forth by Linus clearly indicates he did not write Linux because he harbored any vengeance towards Andrew Tanenbaum, but rather because creating Linux presented an intellectual challenge for him and the others that became involved in the Linux project. Therefore, indirectly, Mr. Brown did a service to both Linus and Andy and strengthened their images.
|Lion Kuntz 05/24/04 08:03:11 AM EDT|
I updated a number of pages on Disinfopedia wiki website to document the culpability of Alexis de SMOKEville's sordid history as a tobacco industry shill. These are some of the new or revised pages, followed by some quotations that search engines will draw upon for results pages. If you post these links in your blogs, you can be pretty sure that every search engine will rank the tobacco connection higher than the FUD pages they post. Over 80% of Americans have quit or never started smoking, so the fans of tobacco shills are always a minority.
The same John M. Olin Foundation that funds the John M. Olin Center for Policy also funds the Alexis de Tocqueville Institution. The Olin, Scaife and Koch foundations fund the entire list above, apart from the Congressional Office of Technology Assessment, which is funded through campaign contributions instead of foundations. Singer, Tollison and Wagner were all from George Mason University, a favorite charity of right-wing donors and energy billionaires Koch and Scaife.
Search for Robert D. Tollison
32,523 mentions in the Tobacco Institute files ordered online in a court settlement. That's pretty good, but no cigar...
Search for S. Fred Singer
Even though Robert D. Tollison wrote the book on how good for you second-hand smoke is, S. Fred Singer has won the race for covert cash from the disinformation lung polluters of tobacco AND oil companies. Singer helped write the OTHER book that Tollison was only "technical advisor" on, published by Alexis de SMOKEville Institution, er, Alexis de Tocqueville Institution.
In 1994 Cesar Conda, executive director of the Alexis de Tocqueville Institution, was listed under "Senior Staff and Contributing Associates" on a Lorillard Tobacco Company paid-for publication titled "Science, Economics, and Environmental Policy" by author Kent Jeffreys. The Principal Reviewer was listed as S. Fred Singer, and to give this propagandistic tract a sheen of scientific appearance, a loaded gang of "experts" from assorted tobacco-funded front organizations with impressive names was listed: SEPP, Hoover Institution, John M. Olin Center for Policy, George Mason University.
As Executive Director of the Alexis de Tocqueville Institution, Conda had more than a casual association with the production of this deception piece. SEPP was certainly known to him, as an article the same year in Commonsense (Fall 1994), "The New Populism: The Rise of the Property Rights Movement," by Cesar Conda and Mark LaRochelle, mentions SEPP. Kent Jeffreys' bona fides would also be known to him. Jeffreys at the time was listed as environmental studies director for the Competitive Enterprise Institute, an organization with close ties to Alexis de Tocqueville.
|Ze Plot Thickens 05/24/04 07:52:05 AM EDT|
This has been posted at the alt.os newsgroup, from Justin Orndorff ([email protected]) of the AdTI:
I'm conducting some research on behalf of the Alexis de Tocqueville
1. Describe the components of an operating system, besides the central
Thanks for your time. Best,
|aNoN 05/24/04 07:46:21 AM EDT|
"Hidden agenda" being Amsterdam's coffeeshops and whorehouses, no doubt.
Not that there's anything WRONG with that. I'm just saying that any excuse to go to Amsterdam is a good one.
What a great city.
|madprof 05/24/04 07:44:09 AM EDT|
Poor old Ken Brown must be wondering how wise it was to have made that particular trip now!
|Minna Kirai 05/24/04 07:42:59 AM EDT|
That Tanenbaum is still antagonistic to Linus's system gives him even more credibility. If a friend vouches for you, that might be discounted as a buddy covering for you- but if an enemy says you're innocent, then he's got no motivation to lie on your behalf.