paul's e-scrapbook

"ALAN KAY" EXCERPTS


Collated by Paul Quek











Perhaps it was commercialization in the 1980s that killed off the next expected new thing. Our plan and our hope was that the next generation of kids would come along and do something better than Smalltalk around 1984 or so. We all thought that the next level of programming language would be much more strategic and even policy-oriented and would have much more knowledge about what it was trying to do. But a variety of different things conspired together, and that next generation actually didn't show up. One could actually argue -- as I sometimes do -- that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.

      You could think of it as putting a low-pass filter on some of the good ideas from the '60s and '70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.

      So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.


    -- "A Conversation with Alan Kay" (adapted)
       ACM Queue: Programming Languages (Vol. 2, No. 9, Dec/Jan 2004-2005)







If you look at software today, through the lens of the history of engineering, it's certainly engineering of a sort -- but it's the kind of engineering that people without the concept of the arch did. Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

      I would compare the Smalltalk stuff that we did in the '70s with something like a Gothic cathedral. We had two ideas, really. One of them we got from Lisp: late binding. The other one was the idea of objects. Those gave us something a little bit like the arch, so we were able to make complex, seemingly large structures out of very little material, but I wouldn't put us much past the engineering of 1,000 years ago.

      If you look at [Doug] Engelbart's demo [a live online hypermedia demonstration of the pioneering work that Engelbart's group had been doing at Stanford Research Institute, presented at the 1968 Fall Joint Computer Conference], then you see many more ideas about how to boost the collective IQ of groups and help them to work together than you see in the commercial systems today. I think there's this very long lag between what you might call the best practice in computing research over the years and what is able to leak out and be adapted in the much more expedient and deadline-conscious outside world.

      It's not that people are completely stupid, but if there's a big idea and you have deadlines and you have expedience and you have competitors, very likely what you'll do is take a low-pass filter on that idea and implement one part of it and miss what has to be done next. This happens over and over again. If you're using early-binding languages as most people do, rather than late-binding languages, then you really start getting locked in to stuff that you've already done. You can't reformulate things that easily.
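
      [A sketch of the late-binding point, in Python rather than Smalltalk -- the Document class and show function are hypothetical illustrations, not from the interview. Because the method is looked up at the moment of the call, the formulation can be changed without touching the callers:]

          # Late binding: the call site names a message; the method that
          # answers it is found at run time, not frozen in at compile time.
          class Document:
              def render(self):
                  return "plain text"

          def show(doc):
              print(doc.render())    # 'render' is resolved at call time

          d = Document()
          show(d)                    # -> plain text

          # Reformulate while the system runs; callers need no changes.
          Document.render = lambda self: "hypertext"
          show(d)                    # -> hypertext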

      Let's say the adoption of programming languages has very often been somewhat accidental, and the emphasis has very often been on how easy it is to implement the programming language rather than on its actual merits and features. For instance, Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.

      This happens over and over again. The languages of Niklaus Wirth have spread wildly and widely because he has been one of the most conscientious documenters of languages and one of the earlier ones to do algorithmic languages using p-codes (pseudocodes) -- the same kinds of things that we use. The idea of using those things has a common origin in the hardware of a machine called the Burroughs B5000 from the early 1960s, which the establishment hated.

      ... I was there, and Burroughs actually hired college graduates to explain that machine to data-processing managers. There was an immense amount of information available. The problem was that the DP managers didn't want to learn new ways of computing, or even how to compute. IBM realized that and Burroughs didn't.

      ... the original machine [Burroughs B5000] had two CPUs, and it was described quite adequately in a 1961 paper by Bob Barton, who was the main designer. One of the great documents was called "The Descriptor" and laid it out in detail. The problem was that almost everything in this machine was quite different and what it was trying to achieve was quite different.

      The reason that line lived on -- even though the establishment didn't like it -- was precisely because it was almost impossible to crash it, and so the banking industry kept on buying this line of machines, starting with the B5000. Barton was one of my professors in college, and I had adapted some of the ideas on the first desktop machine that I did. Then we did a much better job of adapting the ideas at Xerox PARC (Palo Alto Research Center).

      Neither Intel nor Motorola nor any other chip company understands the first thing about why that architecture was a good idea.

      Just as an aside, to give you an interesting benchmark -- on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today. Moore's law has given us somewhere between 40,000 and 60,000 times improvement in that time. So there's approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.
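
      [The arithmetic behind that factor of 1,000, taking the midpoint of the 40,000-to-60,000 range:]

          \text{efficiency lost} \approx \frac{\text{Moore's-law gain}}{\text{measured speedup}} = \frac{50{,}000}{50} = 1{,}000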

      The myth that it doesn't matter what your processor architecture is -- that Moore's law will take care of you -- is totally false.

      ...

      ... both Lisp and Smalltalk were done in by the eight-bit microprocessor -- it's not because they're eight-bit micros, it's because the processor architectures were bad, and they just killed the dynamic languages. Today these languages run reasonably because even though the architectures are still bad, the level 2 caches are so large that some fraction of the things that need to work, work reasonably well inside the caches; so both Lisp and Smalltalk can do their things and are viable today. But both of them are quite obsolete, of course.

      The stuff that is in vogue today is only about "one-half" of those languages. Sun Microsystems had the right people to make Java into a first-class language, and I believe it was the Sun marketing people who rushed the thing out before it should have gotten out. They made it impossible for the Sun software people to do what needed to be done.

      ...

      Like I said, it's a pop culture. A commercial hit record for teenagers doesn't have to have any particular musical merits. I think a lot of the success of various programming languages is expeditious gap-filling. Perl is another example of filling a tiny, short-term need, and then being a real problem in the longer term. Basically, a lot of the problems that computing has had in the last 25 years come from systems where the designers were trying to fix some short-term thing and didn't think about whether the idea would scale if it were adopted. There should be a half-life on software so old software just melts away over 10 or 15 years.

      It was a different culture in the '60s and '70s; the ARPA (Advanced Research Projects Agency) and PARC culture was basically a mathematical/scientific kind of culture and was interested in scaling, and of course, the Internet was an exercise in scaling. There are just two different worlds, and I don't think it's even that helpful for people from one world to complain about the other world -- like people from a literary culture complaining about the majority of the world that doesn't read for ideas. It's futile.

      I don't spend time complaining about this stuff, because what happened in the last 20 years is quite normal, even though it was unfortunate. Once you have something that grows faster than education grows, you're always going to get a pop culture. It's well known that I tried to kill Smalltalk in the later '70s. There were a few years when it was the most wonderful thing in the world. It answered needs in a more compact and beautiful way than anything that had been done before. But time moves on. As we learned more and got more ambitious about what we wanted to do, we realized that there are all kinds of things in Smalltalk that don't scale the way they should -- for instance, the reflection stuff that we had in there. It was one of the first languages to really be able to see itself, but now it is known how to do all levels of reflection much better -- so we should implement that.
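
      [A sketch of reflection -- a running program that can see and extend itself -- in Python rather than Smalltalk; the Point class is a hypothetical example, not from the interview:]

          # A program inspecting its own structure at run time.
          class Point:
              def __init__(self, x, y):
                  self.x, self.y = x, y

          p = Point(3, 4)
          print(type(p).__name__)   # the object reports its own class: Point
          print(vars(p))            # ... and its own state: {'x': 3, 'y': 4}

          # The system can also rewrite itself while running:
          setattr(Point, 'norm', lambda self: (self.x ** 2 + self.y ** 2) ** 0.5)
          print(p.norm())           # -> 5.0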

      We saw after a couple of years that the reflection could be done much better, and the same went for the object model. So the problem is -- I've said this about both Smalltalk and Lisp -- they tend to eat their young. What I mean is that both Lisp and Smalltalk are really fabulous vehicles, because they have a meta-system. They have so many ways of dealing with problems that the early-binding languages don't have that it's very, very difficult for people who like Lisp or Smalltalk to imagine anything else.

      Now just to mention a couple of things about Java: it really doesn't have a full meta-system. It has always had the problem -- for a variety of reasons -- of having two regimes, not one regime. It has things that aren't objects, and it has things that it calls objects. It has real difficulty in being dynamic. It has a garbage collector. So what? Those have been around for a long time. But it's not that great at adding to itself.
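
      [One way to see the "two regimes" complaint -- in Java, an int is not an object, whereas in a single-regime language (Python here, standing in for Smalltalk) even literals are full objects that answer messages:]

          print((5).bit_length())       # an integer literal answers a message: 3
          print(isinstance(5, object))  # True: no second, non-object regime
          print((2.5).is_integer())     # floats are ordinary objects too: False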

      For many years, the development kits for Java were done in C++. That is a telling thing.

      We looked at Java very closely in 1995 when we were starting on a major set of implementations, just because it's a lot of work to do a viable language kernel. The thing we liked least about Java was the way it was implemented. It had this old idea, which has never worked, of having a set of paper specs, having to implement the VM (virtual machine) to the paper specs, and then having benchmarks that try to validate what you've just implemented -- and that has never resulted in a completely compatible system.

      The technique that we had for Smalltalk was to write the VM in itself, so there's a Smalltalk simulator of the VM that was essentially the only specification of the VM. You could debug and you could answer any question about what the VM would do by submitting stuff to it, and you made every change that you were going to make to the VM by changing the simulator. After you had gotten everything debugged the way you wanted, you pushed the button and it would generate, without human hands touching it, a mathematically correct version of C that would go on whatever platform you were trying to get onto.
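
      [A toy illustration of the "simulator as the only specification" idea -- a miniature stack VM in Python, hypothetical and far simpler than the actual Squeak interpreter, which was written in a subset of Smalltalk and mechanically translated to C. The point is that the executable definition below is the single authority on what the VM does:]

          def run(code, stack=None):
              """Execute a list of (opcode, argument) pairs; return the stack."""
              stack = stack if stack is not None else []
              for op, arg in code:
                  if op == 'PUSH':
                      stack.append(arg)
                  elif op == 'ADD':
                      b, a = stack.pop(), stack.pop()
                      stack.append(a + b)
                  elif op == 'MUL':
                      b, a = stack.pop(), stack.pop()
                      stack.append(a * b)
                  else:
                      raise ValueError('unknown opcode: %s' % op)
              return stack

          # Any question about the VM's behavior is answered by running it.
          # (3 + 4) * 2, expressed as stack code:
          print(run([('PUSH', 3), ('PUSH', 4), ('ADD', None),
                     ('PUSH', 2), ('MUL', None)]))   # -> [14]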

      The result is that this system today, called Squeak, runs identically on more than two dozen platforms. Java does not do that. If you think about what the Internet means, it means you have to run identically on everything that is hooked to the Internet. So Java, to me, has always violated one of the prime things about software engineering in the world of the Internet.

      Once we realized that Java was likely not to be compatible from platform to platform, we basically said we'll generate our own system that is absolutely compatible from platform to platform, and that's what we did.

      Anybody can do that. If the pros at Sun had had a chance to fix Java, the world would be a much more pleasant place. This is not secret knowledge. It's just secret to this pop culture.


    -- "A Conversation with Alan Kay" (adapted)
       ACM Queue: Programming Languages (Vol. 2, No. 9, Dec/Jan 2004-2005)







... the big revelation to me when I was in graduate school [was that Lisp was carefully defined in terms of Lisp] -- when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were "Maxwell's Equations of Software!" This is the whole world of programming in a few lines that I can put my hand over.

      I realized that anytime I want to know what I'm doing, I can just write down the kernel of this thing in a half page and it's not going to lose any power. In fact, it's going to gain power by being able to reenter itself much more readily than most systems done the other way can possibly do.
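
      [A rough transcription of that half-page kernel into Python -- an approximation for flavor, not McCarthy's original. Expressions are nested tuples, and this one eval function, plus an environment of primitives, is the whole interpreter:]

          def l_eval(x, env):
              if isinstance(x, str):                 # variable reference
                  return env[x]
              if not isinstance(x, tuple):           # self-evaluating atom
                  return x
              op, *args = x
              if op == 'quote':
                  return args[0]
              if op == 'if':
                  test, then, alt = args
                  return l_eval(then if l_eval(test, env) else alt, env)
              if op == 'lambda':                     # ('lambda', params, body)
                  params, body = args
                  return lambda *vals: l_eval(
                      body, {**env, **dict(zip(params, vals))})
              fn = l_eval(op, env)                   # ordinary application
              return fn(*(l_eval(a, env) for a in args))

          env = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
          square = ('lambda', ('x',), ('*', 'x', 'x'))
          print(l_eval((square, ('+', 3, 4)), env))  # -> 49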

      All of these ideas could be part of both software engineering and computer science, but I fear -- as far as I can tell -- that most undergraduate degrees in computer science these days are basically Java vocational training.

      I've heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification.


    -- "A Conversation with Alan Kay" (adapted)
       ACM Queue: Programming Languages (Vol. 2, No. 9, Dec/Jan 2004-2005)







Somebody recently asked me what I am, and I answered along the following lines: there is a discipline called mathematics, one called science, and one called engineering, and if you put those in a Venn diagram, the intersection of the three is modern-day technology. Engineering has been around a lot longer than science, but there is very little engineering done today that is not hooked into scientific investigation and scientific results. And math is the lingua franca for both of these disciplines.

      I would say that, temperamentally, I am basically an idealist, which makes me pretty much a mathematician. Scientists tend to be realists, and engineers pragmatists. I am much more of an idealist than a realist or a pragmatist. In a way, when I think of myself, I think of myself as a scientist more than a mathematician or an engineer, but when I look at what I have done, it has been mostly math and engineering and very little actual science. So these are just three ways of dealing with those things.






Check these out:

  • "Predicting The Future" -- By Alan C. Kay (from Stanford Engineering, Volume 1, Number 1, Autumn 1989, pg 1-6)
  • Distant Thunder: Interview in Context Magazine (1999)
  • "The Power Of The Context" by Alan Kay (Remarks upon being awarded — with Bob Taylor, Butler Lampson and Chuck Thacker — the Charles Stark Draper Prize of the National Academy of Engineering, February 24, 2004 )
  • "Software Design, the Future of Programming and the Art of Learning" -- interview with Alan Kay (EduCom Review - March-April 1999)
  • "Revealing the Elephant: The Use and Misuse of Computers in Education" -- By Alan Kay (Educom Review July-August 1996)

  • mp3 -- Alan Kay, "The Computer Revolution Hasn't Happened Yet"

  • Alan Kay -- Viewpoints Research Institute (President: Alan Kay)
  • Alan Kay, by Scott Gasch (mirror of above)

  • Kay, Alan -- "Welcome to Squeakland" (27 December, 2005)
    Homepage of Squeakland
    Homepage of the Squeak programming language
  • Alan Kay Colloquium: Squeaking by on New Ideas

  • Alan Kay -- Smalltalk.org™

  • Smalltalk with object-oriented programming pioneer Kay
  • Alan Kay -- From Wikipedia, the free encyclopedia
  • Alan Kay -- minnow.cc.gatech.edu
  • "A Conversation with Alan Kay" -- Programming Languages (Vol. 2, No. 9 - Dec/Jan 2004-2005)
  • Alan Kay -- unrev.stanford.edu
  • Pioneers | Alan Kay | Interface (1972)

  • Daddy, Are We There Yet? A Discussion with Alan Kay
  • Daddy, Are We There Yet? A Discussion with Alan Kay -- mirror page
  • Notes from "Daddy, Are We There Yet?"

  • Smalltalk Creator Wins 'Nobel Prize' of Computing -- Internetnews.com
  • What will Alan Kay do next? -- Macworld.com
  • Watch What I Do -- ebook (Foreword by Alan Kay)





  • Alan Kay Short Bio

    Alan Kay Quotes

    Computer Science / Personal Computing
    (in http://pq.1994.tripod.com)
