By Andrew S. Tanenbaum
May 24, 2004 12:00 AM EDT
Background
On 20 May 2004, I posted a statement refuting the claim of Ken Brown, President of the Alexis de Tocqueville Institution, that Linus Torvalds didn't write Linux. My statement was mentioned on Slashdot, Groklaw, and many other Internet news sites. This attention resulted in over 150,000 requests to our server in less than a day. The server is still standing, despite yesterday being a national holiday with no one there to stand next to it saying "You can do it. You can do it." Kudos to Sun Microsystems and the folks who built Apache. My statement was mirrored all over the Internet, so the number of true hits to it is probably a substantial multiple of that. There were also quite a few comments at Slashdot, Groklaw, and other sites, many of them about me. I had never engaged in remote multishrink psychoanalysis on this scale before, so it was a fascinating experience.
The Brown Book
I got an advance copy of Ken Brown's book. I think it is still under embargo, so I won't comment on it. Although I am not an investigative reporter, even I know it is unethical to discuss publications still under embargo. Some of us take ethics more seriously than others. So I won't even reveal the title. Let's call it The Brown Book. There is some precedent for nicknaming books after colors: the international standard for the audio CD (IEC 908) is usually called the Red Book, and the CD-ROM was described in the Yellow Book (ISO/IEC 10149). Suffice it to say, there is a great deal to criticize in the book. I am sure that will happen when it is published. I may even help out.
What prompted me to write this note today is an e-mail I got yesterday. Actually, I got quite a few :-), most of them thanking me for the historical material. One of yesterday's e-mails was from Linus, in response to an e-mail from me apologizing for not letting him see my statement in advance. As a matter of courtesy, I did try, but I was using his old transmeta.com address and didn't know his new one until I got a very kind e-mail from Linus' father, a Finnish journalist.
In his e-mail, Linus said that Brown never contacted him. No e-mail, no phone call, no personal interview. Nothing. Considering the fact that Brown was writing an explosive book in which he accused Linus of not being the author of Linux, you would think a serious author would at least confront the subject with the accusation and give him a chance to respond. What kind of a reporter talks to people on the periphery of the subject but fails to talk to the main player?
Why did Brown fly all the way to Europe to interview me and (according to an e-mail I got from his seat-mate on the plane) one other person in Scandinavia, at considerable expense, and not at least call Linus? Even if he made a really bad choice of phone company, how much could that cost? Maybe a dollar? I call the U.S. all the time from Amsterdam. It is less than 5 cents a minute. How much could it cost to call California from D.C.?
From reading all the comments posted yesterday, I am now beginning to get the picture. Apparently a lot of people (still) think that I 'hate' Linus for stealing all my glory (see below for more on this). I didn't realize this view was so widespread. I now suspect that Brown believed this, too, and thought that I would be happy to dump all over Linus to get 'revenge.' By flying to Amsterdam he thought he could dig up dirt on Linus and get me to speak evil of him. He thought I would back up his crazy claim that Linus stole Linux from me. Brown was wrong on two counts. First, I bear no 'grudge' against Linus at all. He wrote Linux himself and deserves the credit. Second, I am really not a mean person. Even if I were still angry with him after all these years, I wouldn't choose some sleazy author with a hidden agenda as my vehicle. My home page gets 2500 hits a week. If I had something to say, I could put it there.
When The Brown Book comes out, there will no doubt be a lot of publicity in the mainstream media. Any of you with contacts in the media are actively encouraged to point reporters to this page and my original statement to provide some balance. I really think Brown's motivation should come under scrutiny. I don't believe for a nanosecond that Brown was trying to do a legitimate study of IP and open source or anything like that. I think he was trying to make the case that the people funding him (whom he refused to disclose despite my asking point blank) wanted made. Having an institution with an illustrious-sounding name make the case looks better than having an interested party make it.
Clearing Up Some Misconceptions
I would like to close by clearing up a few misconceptions and also correcting a couple of errors. First, I REALLY am not angry with Linus. HONEST. He's not angry with me either. I am not some kind of "sore loser" who feels he has been eclipsed by Linus. MINIX was only a kind of fun hobby for me. I am a professor. I teach and do research and write books and go to conferences and do things professors do. I like my job and my students and my university. If you want to get a master's there, see my home page for information. I wrote MINIX because I wanted my students to have hands-on experience playing with an operating system. After AT&T forbade teaching from John Lions' book, I decided to write a UNIX-like system for my students to play with. Since I had already written two books at that point, one on computer architecture and one on computer networks, it seemed reasonable to describe the system in a new book on operating systems, which is what I did. I was not trying to replace GNU/HURD or Berkeley UNIX. Heaven knows, I have said this enough times. I just wanted to show my students and other students how you could write a UNIX-like system using modern technology. A lot of other people wanted a free production UNIX with lots of bells and whistles and wanted to convert MINIX into that. I was dragged along in the maelstrom for a while, but when Linux came along, I was actually relieved that I could go back to professoring. I never really applied for the position of King of the Hackers and didn't want the job when it was offered. Linus seems to be doing excellent work and I wish him much success in the future.
While writing MINIX was fun, I don't really regard it as the most important thing I have ever done. It was more of a distraction than anything else. The most important thing I have done is produce a number of incredibly good students, especially Ph.D. students. See my home page for the list. They have done great things. I am as proud as a mother hen. To the extent that Linus can be counted as my student, I'm proud of him, too. Professors like it when their students go on to greater glory. I have also written over 100 published research papers and 14 books which have been translated into about 20 languages. As a result I have become a Fellow of the IEEE, a Fellow of the ACM, and won numerous other awards. For me, these are the things that really count. If MINIX had become a big 'commercial' success I wouldn't have had the time to do all this academic stuff that I am actually more interested in.
I can't resist saying a few words about microkernels. A microkernel is a very small kernel. If the file system runs inside the kernel, it is NOT a microkernel. The microkernel should handle low-level process management, scheduling, interprocess communication, interrupt handling, and the basics of memory management, and little else. The core microkernel of MINIX 1.0 was under 1400 lines of C and assembler. To that you have to add the headers and device drivers, but the totality of everything that ran in kernel mode was under 5000 lines. Microsoft claimed that Windows NT 3.51 was a microkernel. It wasn't. It wasn't even close. Even they dropped the claim with NT 4.0. Some microkernels have been quite successful, such as QNX and L4. I can't for the life of me see why people object to the 20% performance hit a microkernel might give you when they program in languages like Java and Perl, where you often take a factor-of-20 performance hit. What's the big deal about turning a 3.0 GHz PC into a 2.4 GHz PC due to a microkernel? Surely you once bought a machine appreciably slower than 2.4 GHz and were very happy with it. I would easily give up 20% in performance for a system that was robust, reliable, and wasn't susceptible to many of the ills we see in today's massive operating systems.
I would now like to correct an error in my original statement. One of the e-mails I got yesterday clarified the origins of Coherent. It was not written by Bob Swartz. He was CEO of the Mark Williams Company. Three ex-students from the University of Waterloo, Dave Conroy, Randall Howard, and Johann George, did most of the work. Waterloo is in Canada, where they also play baseball I am told, but only after the ice melts and they can't play hockey. It took the Waterloo students something like 6 man-years to produce Coherent, but this included the kernel, the C compiler, the shell, and ALL the utilities. The kernel is only a tiny fraction of the total code, so it may well be that the kernel itself took a man-year. It took me three years to write MINIX, but I was working on it only in the evenings, and I also wrote 400 pages of text describing the code in that time period (also in the evenings). I think a good programmer can write a 12,000-line kernel in a year.
If you have made it this far, thank you for your time.
Andy Tanenbaum, 21 May 2004
|Randall Howard 08/16/08 10:10:21 PM EDT|
Although this article is now four years old, I just stumbled on it doing a different search. I'm one of the three writers of Coherent mentioned in this article. Yes, we all went to University of Waterloo before heading to Chicago to write Coherent. As well, prior to this, when I was working at the Computer Communications Networks Group there, I met Andy Tanenbaum in one of his sojourns. And, I recall him being a really great guy.
Regarding our timeline to develop a kernel, the three of us divvied up the work. Dave Conroy, who had previously written the Decus C compiler (and is now a senior hardware designer at Apple), focused on compiler work as well as a number of other major tasks. I wrote the kernel, many of the libraries, and a number of utilities. Johan George did library and utility work. We started in January 1980 and by December 1980 had moved from cross-compiling to native builds. This would put the kernel development time at 10+ months. At the January 1981 Usenix in San Francisco, I recall Steve Bourne and Dennis Ritchie hacking around with the system and, in particular, analyzing the interrupt latency in the kernel.
Later, when Western Electric lawyers sent Dennis Ritchie up to check on the code, he quickly determined it was indeed an original creation. And, before leaving, he commented that the team at Coherent showed "amazing programmer productivity." We were certainly very pleased.
We had a number of innovations such as hot-loadable drivers which I'm not sure Windows ever achieved. It was a lot of fun, but also a lot of years ago.
|Chris Laffra 11/03/04 09:51:30 PM EST|
Having been a student in a couple of Andy's classes in the mid-eighties, when he was busily working on Minix, I feel both privileged and honored to have done my studies at the Vrije Universiteit in Amsterdam. I still use many of the lessons taught by this kind, very smart, and annoyingly productive individual. [What mere mortal can be a full-time professor and still write an operating system and 14 books, and find time to manage an electoral voting web site?]
I must admit, though, that I had my private suspicions that Andy must have felt some resentment at Linux's success, and Minix's "lack" of it. I am happy and reassured to find out the opposite.
I advise everyone to follow Andy's advice and do their Master's at the Free University, which is not entirely free, of course.
|npd 06/01/04 07:08:31 PM EDT|
As you said, if you had "gotten caught up in" producing your own kernel and given up academia, at least a small portion of the great things that are happening right now would not be.
For this, as I am sure your students would be honored to do, I salute you. I feel like I know where you are coming from, even though I don't know you.
|Louis HR Muller 05/24/04 11:10:30 PM EDT|
Andrew Tanenbaum has demonstrated that he is a man of character. Had he desired, he could have seized the opportunity to add fuel to the fire by indirectly attacking Linus through Brown. Instead, he supported Linus and has earned the respect of many in the Linux community. As a result, his Minix will receive more recognition as an inspiration for Linus and a useful educational tool for all those who learned more about Unix by using Minix. A reading of the early e-mails put forth by Linus clearly indicates he did not write Linux because he harbored any vengeance towards Andrew Tanenbaum, but rather because creating Linux presented an intellectual challenge for him and the others who became involved in the Linux project. Therefore, indirectly, Mr. Brown did a service to both Linus and Andy and strengthened their images.
|Lion Kuntz 05/24/04 08:03:11 AM EDT|
I updated a number of pages on Disinfopedia wiki website to document the culpability of Alexis de SMOKEville's sordid history as a tobacco industry shill. These are some of the new or revised pages, followed by some quotations that search engines will draw upon for results pages. If you post these links in your blogs, you can be pretty sure that every search engine will rank the tobacco connection higher than the FUD pages they post. Over 80% of Americans have quit or never started smoking, so the fans of tobacco shills are always a minority.
The same John M. Olin Foundation that funds the John M. Olin Center for Policy also funds the Alexis de Tocqueville Institution. The Olin, Scaife and Koch foundations fund the entire list above, apart from the Congressional Office of Technology Assessment, which is funded through campaign contributions instead of foundations. Singer, Tollison and Wagner were all from George Mason University, a favorite charity of right-wing donors and energy billionaires Koch and Scaife.
Search for Robert D. Tollison
32,523 mentions in the Tobacco Institute files ordered online in a court settlement. That's pretty good, but no cigar...
Search for S. Fred Singer
Even though Robert D. Tollison wrote the book on how good for you second-hand smoke is, S. Fred Singer has won the race for covert cash from the disinformation lung polluters of tobacco AND oil companies. Singer helped write the OTHER book that Tollison was only "technical advisor" on, published by Alexis de SMOKEville Institution, er, Alexis de Tocqueville Institution.
In 1994 Cesar Conda, executive director of the Alexis de Tocqueville Institution, was listed under "Senior Staff and Contributing Associates" on a Lorillard Tobacco Company paid-for publication titled "Science, Economics, and Environmental Policy" by author Kent Jeffreys. The principal reviewer was listed as S. Fred Singer, and to give this propagandistic tract a sheen of scientific appearance, a loaded gang of "experts" from assorted tobacco-funded front organizations with impressive names was listed: SEPP, the Hoover Institution, the John M. Olin Center for Policy, and George Mason University.
As executive director of the Alexis de Tocqueville Institution, Conda had more than a casual association with the production of this deception piece. SEPP was certainly known to him, as an article the same year in Commonsense (Fall 1994), "The New Populism: The Rise of the Property Rights Movement," by Cesar Conda and Mark LaRochelle, mentions SEPP. Kent Jeffreys' bona fides would also be known to him. Jeffreys at the time was listed as environmental studies director for the Competitive Enterprise Institute, an organization with close ties to Alexis de Tocqueville.
|Ze Plot Thickens 05/24/04 07:52:05 AM EDT|
I'm conducting some research on behalf of the Alexis de Tocqueville
1. Describe the components of an operating system, besides the central
Thanks for your time. Best,
|aNoN 05/24/04 07:46:21 AM EDT|
"Hidden agenda" being amsterdam's coffeeshops and whorehouses, no doubt.
Not that theres anything WRONG with that. Im just saying that any excuse to go to amsterdam is a good one.
What a great city.
|madprof 05/24/04 07:44:09 AM EDT|
Poor old Ken Brown must be wondering how wise it was to have made that particular trip now!
|Minna Kirai 05/24/04 07:42:59 AM EDT|
That Tanenbaum is still antagonistic to Linus's system gives him even more credibility. If a friend vouches for you, that might be discounted as a buddy covering for you- but if an enemy says you're innocent, then he's got no motivation to lie on your behalf.