Lisp50 Notes part VI: The Future of Lisp

October 25, 2008

In a sense, I’ve written all the other posts in this series just so you can understand what happened when Rich Hickey gave his talk. Maybe you’ve already read a post from a Lisp hacker that was there, so you think you already know how this ends. Think again. Let me break this down so you can see the full context in which he was speaking:

* We started with the Steele/Gabriel “meta-talk.” As the guy on LispCast said, this talk inadvertently gave an impression of general stagnation among the Common Lisp community.

* We’d heard much of the history of Lisp and Common Lisp… and the story seemed to indicate that the Lisp community spent too much energy fighting among themselves before they could finally realize that C and C++ were their real enemies.

* Fritz Kunze’s idea of completing a new Lisp within 12 months pretty much tanked.

* Those of us that held out hopes for Scheme being the answer to our dreams were disappointed to hear William Clinger describe in exquisite detail all of the problems with the R6RS process.

I don’t know when it was, but at some point a guy toward the front pleaded for just some basic libraries for GUI’s, database access, and web services. Someone tried to punt on that question, but a consensus emerged that it was way too hard for anyone that just wanted to build a stupid application. Later on we would hear about some of the technical reasons behind that difficulty, but still… there was no good answer on the table.

So into that setting comes Rich Hickey. We wanted a new Lisp. A simpler Lisp. A cleaned up Lisp with decent library support. We wanted to be able to use Lisp on our day jobs and not just on “the back forty.” And Rich Hickey basically says to us, “I’ve already got one.” (!)

And not only that… he gave a really good presentation. I admit, I’m a snob. I never really looked at Clojure myself– one mention of the word “Java” and my mind was closed to whatever it had to offer. But he knew all of my objections and had clear and concise answers for them. He also knew about the kinds of frustrations I’ve had with Common Lisp… and he had answers for those as well. But he also knew what makes Lisp great. The thing that clinched it was that he knew exactly why we were so enamored with Alan Perlis. And he could take Perlis’s ideas one step further.

It was electrifying.

This is the thing that sold me: hash tables in Common Lisp are a pain to work with. As Hickey said, “you can’t write lispy code with Lisp hash tables.” But take a look at how you do hash tables in Clojure:

[Slide: Clojure Data Structures]

In short, Clojure provides better ways to produce homogeneous code that operates on heterogeneous data structures.  Clojure provides more uniformity in data structure access.  Clojure provides something I’ve wanted for months, but never fully imagined on my own.
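To make that concrete, here is a small sketch of what Clojure map handling looks like. (The slide contents weren’t captured here, so this is my own illustration of the idea, not Hickey’s code.)

```clojure
;; A map literal -- no make-hash-table, no gethash ceremony.
(def m {:name "Rich" :talk "Clojure"})

;; Maps are functions of their keys, and keywords are functions of maps:
(m :name)   ;=> "Rich"
(:talk m)   ;=> "Clojure"

;; The same generic functions work on maps, vectors, and lists alike:
(keys m)             ;=> (:name :talk)
(count m)            ;=> 2
(assoc m :year 2008) ;=> {:name "Rich", :talk "Clojure", :year 2008}
```

Because every collection answers to the same handful of functions, code that consumes heterogeneous data structures stays homogeneous– which is exactly the uniformity described above.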

And unlike some of the talks that were tinged with a little bitterness, a sense of loss, or a sense of nostalgia, Hickey was upbeat and forward looking. “Lisp as an idea is still vibrant, especially among young developers,” he said. “People are excited about Lisp and the idea of Lisp.”

At the end of the talk, JonL was the first to speak, as usual. I think even he was temporarily speechless. While he might have had reservations about the hash table trick that I like so much, he still gave his blessing to Hickey: “You love to go to Lisp and you understand why,” he said with clear admiration. The sense of relief that passed over the room at that point was palpable. After JonL was finished responding, a young developer in the back could not contain himself. “This is the best thing I’ve ever heard in any conference,” he said. “Great work.”

And that’s what the Lisp50 conference was about.

Lisp50 Notes part V: Interlisp, PARC, and the Common Lisp Consolidation Wars

October 24, 2008

Okay, quick… how many people besides Alan Kay can you name that worked at Xerox PARC? Not very many, eh? Yeah, that’s your loss. It wasn’t all about the guys like Kay that get all the glory, you know….

Warren Teitelman was by far my favorite speaker. Unfortunately my notes got worse as the day dragged on, so I simply cannot do justice to him. Yeah, Guy Steele is… you know… Guy Steele. (OMG! The guy who wrote the lambda papers is sitting right behind me! Uh… can I have your autograph?) Fritz Kunze seemed like he’d have been a great guy to go out drinking with. And we’d heard from John McCarthy himself, for crying out loud…. But Teitelman came off as being at once knowledgeable and approachable. He didn’t exude any of the “dysfunctional Lisp-programmer” type attitudes at all. (Not that anyone else did….) And his work at PARC and on InterLisp seemed to address things that the Lisp community as a whole seems to have overlooked somewhat in the years since. He’d been right in the middle of some pretty significant moments of computer history, but he still seemed like a pretty humble guy in spite of it.

Teitelman’s early work had to do with debugging Lisp programs while they were still running. When he presented it at a conference in 1968, he was completely flamed by Dijkstra! “We shouldn’t make it easier for programmers to recover from bugs,” Dijkstra said angrily. (He’d been visibly shaking his head throughout the talk even before this outburst.) “How many bugs are we going to tolerate?” he demanded. “Seven,” Teitelman shot back. The audience laughed, of course, but Dijkstra never forgave him. From that day forward, Dijkstra referred to him as the Internationally Renowned Computer Entomologist!

In 1970 he worked on the DWIM features of InterLisp. He’d been discussing the feature for weeks, but somehow the implications of it were just not evident to the people he worked with. The idea was that the computer should be able to automatically correct doubled and transposed characters and the like. He rolled the code out into production, and soon one of his colleagues burst into the room and exclaimed breathlessly: “I was talking on the phone and typed the wrong thing… and… and the computer corrected it!” Until you have something out there, the user can’t be sure what it can do, Teitelman said.

So yes… we were all sitting there listening to the guy that pretty much implemented the first spell checker, the first undo system (for a Lisp environment), and the first online help system. There were some court cases between Microsoft and Apple back in the eighties over who had these features first, but Teitelman of course has printouts from long before either one was on the scene! I think he was one of the first people to get a personal computer. (It cost $150,000…. It was something called a “Dorado”.) And when windowing systems were first developed at PARC, he rewired the Lisp programming tools to work with them. He was the first person to use the term “programming environment” in print. Incredible.

He concluded by quoting Alan Kay as saying that there aren’t enough languages these days that are late-binding and self-reflexive. Teitelman said that we’ve got to strive to separate the business process from the code. Your average developer, though, just thinks, “hey… I can easily build this rule in Java!” Don’t!!! This is a concept that people don’t get. If the program is rapidly changing or needs to look at itself, then you really need those late-binding, self-reflexive features, he said.

After hearing about many of the wonderful developments in the various branches of Lisp, Kent Pitman told us about the extremely Machiavellian process of consolidating them under Common Lisp. (Earlier in the day, Fritz Kunze had remarked that this was a “deeply flawed” process and that the mailing archive revealed it to be an “unfolding disaster.”) Pitman had a strikingly cynical slide detailing the strategies of the process organizers: “due process is an illusion,” it said. Other bullet points sounded like something out of The Art of War: “turn opponents against each other and let them wear each other down.” “Soliciting volunteers gives critics something to do,” another said, “which dilutes their passion, pacifies them, and makes them feel involved.” In order to isolate the Interlisp guys, dark rumors were spread about various computing nightmares caused by Teitelman’s DWIM features!

Pitman recounted how, while at MIT, he’d written the MacLisp manual and naively signed the copyright over to the school. “Oh, it’s just a formality,” they said. Not really understanding what copyright was about, he didn’t think anything of it. He also didn’t see a dime of the money that MIT brought in with the book sales. Pitman wrote up the standard for the Common Lisp project and was very careful to avoid a similar deal for that. The committee would vote on the truth of the various propositions, but he would do all of the writing. No matter what, he was not going to give up the copyright for any reason!

When ANSI Common Lisp was finally finished, Pitman described how people simply could not come to grips with the fact that it was portable. “What platform is this implementation for?” people would ask. After failing to get people to try it out, these same people would end up calling two years later and saying, “hey, you know… we tried it and it works. Can we use this?” Talk about not getting the point!

He had another slide detailing an eerie phone conversation between him and someone from ANSI. Each side engaged in subtle forms of intimidation to try to get their way. After being pushed to turn it over, Pitman said flatly: “Our committee just voted on the truth of things, not the text. Copyright covers the form of the text, not the truth.” He held his ground and somehow ANSI figured something out. The tragedy is that you can’t get a decent copy of the language standard anymore! And due to all of these maneuverings, it was years before Pitman could get his work published on the web… (Steele’s CLtL books actually beat him, much to his chagrin….) Stallman was not happy with the way Pitman handled copyrights: “I don’t care about useful,” he said, “it has to be free!”

Pitman seemed to have a knack for being a day late and a dollar short. He decided to write a Lisp-based web server back in the nineties, and after six months of working on it Franz (I think) came out with a free one. As a result of this experience, he has reservations not only about giving up his copyrights, but also about the open-source development model in general….

(I apologize for the poor quality of this post.  There so much that could be done to get this right, but I have limited time, etc.  Hopefully this will suffice to give people a sense of the sessions until something better is released.)

Lisp50 Notes part IV: Fritz Kunze Enters the Lisp Mine Field

October 23, 2008

Fritz Kunze (co-founder of Franz Incorporated) gave a talk called “Careening through Lisp mind fields.” He was very informal in his delivery and had lots of humorous anecdotes to go along with his recollections and “armchair” speculations. Combined with the opening “meta talk” by Steele and Gabriel and also JonL’s extremely informal “meaning of life” type talk, one gets the impression that many of the major players in the Lisp world are unable to go along with any sort of standard social convention. Somehow, some way… they all have to find their own way of doing things, and their own quirky personal inflections impact everything they do.

Early on in his talk he told a story about the creator of Eliza: Joseph Weizenbaum. (Weizenbaum died earlier this year, by the way.) The story goes that he walked into someone’s office and the guy in there says, hey– check out my program. You ask it questions and it answers yes or no! So they tried a few, and sure enough, the answers were right every single time. Weizenbaum asked him if it could answer questions in German. The guy says, sure, but you have to do the typing. So Weizenbaum typed in a couple of questions in German and the answers were correct in both cases. “I have to think about this,” he said, and he stomped off to his office. Not long after, he had Eliza up and running. And the program that inspired him? It just gave a “yes” if there were an odd number of characters in the question! (Weizenbaum would later go through a slightly paranoid/Luddite type phase when he was shocked at how ordinary people ascribed ‘real’ intelligence to his Eliza program. He criticized the role of artificial intelligence in society with his book, Computer Power and Human Reason.)

After showing some images created by an extremely unusual Lisp-based drawing program (and talking about some Eliza spinoffs), he then delved into the link between Asperger’s, autism, emotional immaturity, and programming ability. He had a slide outlining some personality traits, and a collective hush fell over the room as we all realized that we were part of some sort of incomprehensible evolutionary programmer-genius phenomenon. Okay, okay… I’m exaggerating. But the slide was interesting: it included stuff like “uncomfortable at parties” and “dislikes travelling”. (Are you a misunderstood genius? Take this quiz!) He talked about some of the problems in managing Lisp guys: “There’s a limit to the number of smart people you can put in a room. Things will break down and they won’t talk to each other.” After some more anecdotes and also some unscientific speculations, he concluded that therapy was the only solution… therapy, that is, for the normal people that have to work with the “Asperger” guys! After a while, the managers and salesmen begin to emulate the smart people, he explained. If the company culture as a whole absorbs too many “aspergerisms”, then the business ceases to be able to talk to its customers!

He concluded his talk with some ideas about the future of Lisp. He thought that it would be possible for Lisp to reposition itself as sort of the ultimate DSL for making web pages. The problem, though, is that the Lisp community is made up of people who can neither compromise nor give up control. He proposed that a non-profit Lisp foundation be set up with the goal of producing a new Lisp within 12 months. “If you study history, you repeat yourself over and over again,” he said. “You’ve got to do something brand new!”

JonL stood up immediately following the talk: “As usual, I couldn’t disagree more with everything you said!” JonL was clearly disgusted, and… uh… unwilling to compromise on any of the points that were presented. “Nothing we can do today can change Lisp’s position in the market,” he concluded.

Lisp50 Notes part III: McCarthy Reaffirms the Importance of Having Access to the Abstract Syntax

October 23, 2008

Due to health reasons, John McCarthy could not attend the conference. Alan Kay was (I hear) heading somewhere else in the evening– it would be a tight fit for his schedule if he was going to be there and also make his other “date”. Seeing as McCarthy couldn’t make it, Kay decided not to come. Instead, Guy Steele conducted the interview by phone.

(What was most striking to me about this was just how careful Steele was and how precise he was with his language. He appeared to already know the answers to all the questions… and if things got sidetracked or drifted off in a random or unclear direction, he always knew just what to say to pull things back together and pin things down. Just observing this interview process, it was clear that Guy was exactly the kind of person that you’d want to have running a massive language standards type project. It’s no accident that Steele is as influential as he is. He was very different from the traditional stereotype of a Lisp hacker. Throughout the day, I’d run into people and say… that’s Guy Steele over there! That’s the guy who wrote the lambda papers! Unfortunately, this didn’t seem to mean much to the typical conference goer….)

When asked what should have been added to Lisp, McCarthy said he would like to add some direct logic, but he didn’t know how to do that in a good way. Steele mentioned something about the Lisp-Prolog people in an aside. (I think Norvig implemented Prolog in Common Lisp in PAIP….)

When asked what separates a good programming language from a bad one, McCarthy said he had one advocacy: a language should have access to its own abstract syntax. Any program which works with programs involves abstract syntax. If you take Java and you want to write a program that deals with Java programs, then you’ve got to scan it and look up the plus sign and its precedence. (What a mess!)
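This is the classic code-as-data argument. As a minimal illustration of what “access to its own abstract syntax” buys you (my own sketch in Clojure, not anything from the interview):

```clojure
;; A Lisp program is just a list; no scanner or precedence table is
;; needed to take it apart.
(def form '(+ 1 (* 2 3)))

(first form)  ; the operator, +, directly accessible
(last form)   ; the subexpression (* 2 3)

;; A trivial program-manipulating program: replace + with -.
(defn swap-op [f] (cons '- (rest f)))

(eval form)            ;=> 7
(eval (swap-op form))  ;=> -5
```

In Java the equivalent transformation would start with lexing and parsing; in a Lisp the abstract syntax is already sitting there as ordinary data.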

Steele asked about the variation in the implementation of key-value pairs and association lists. Was this an accident or a reinvention? McCarthy said he invented both– when he invented the second, he wasn’t thinking about what he’d done the first time.

Steele asked if s-expressions were good or bad. McCarthy said his original idea was to make it like Fortran. (With m-expressions, that is.) Steve Russell took McCarthy’s eval and said you could just program in s-expressions. Being a “conservative guy”, McCarthy thought this was a bad idea. Steele appeared to be slightly taken aback by the idea of such a revolutionary figure describing himself as “conservative.” McCarthy said, “everybody’s conservative about what they’ve been doing.”

(I have several notes on the significance and uses of abstract syntax, but I don’t understand them. If anyone out there knows what he meant about having different versions of it– say, one to write in and one that’s pretty– then please help elucidate this. He also said something about not having really any good examples of Lisp programs that write programs. I think that’s what he said; I’d like to know more about this, too.)

Steele noted that most new languages have a “boss”. (Perl has Larry, Python has Guido, etc.) Steele then asked who the boss of Lisp was. McCarthy cracked back, “if Lisp has any boss, it’s you! You wrote the language manual!” The audience laughed. “There was no boss,” McCarthy said. “I never attempted to be the boss.” (And of course, as we had seen in the historical perspectives on Lisp earlier in the day, there were tons of Lisp implementations in the sixties and seventies. Each “tribe” was going off in its own direction, and people couldn’t really consolidate them at the time because of hardware limitations.) “Even in the early days I wasn’t the boss of Lisp,” he said. “People could get things into Lisp without me knowing it.”

Steele asked which one of his ideas could have had the most practical impact if it had actually been implemented. McCarthy said, if IBM had taken his ideas about time sharing… then things would have been quite a bit better in the 1960’s. At this point, JonL stood up and started clapping.

The final parts of the interview covered Perlis’s attachment to Formula-Algol before he “apologized” and came around to the Lisp approach. McCarthy said he had looked at Formula-Algol when they first proposed it and thought it was pretty bad: it overloaded the algebraic operations and applied them to formulas, he said. He concluded by saying he didn’t know what the alternatives were to abstract syntax. The most important problem we can be working on, he said, was formalizing common sense.

Lisp50 Notes part II: Model-View-Controller Considered Harmful

October 22, 2008

Herbert Stoyan was next to talk at the conference. He covered many obscure details of the earliest days of Lisp. My notes are pretty sparse, but I remember him showing several pictures of McCarthy and Minsky– he said, “Marvin Minsky, he hasn’t ever changed. But if I were to show you twenty pictures of McCarthy, he’d be completely different in each one!” Somewhere in there Richard Gabriel (I think) stood up and said that, “John McCarthy is the worst programmer on the planet. He calls me up for the simplest things!” I’m pretty sure it was some sort of strange Lisp-type humor, as everyone laughed.

Stoyan has a Lisp museum in Germany, but he’s in the process of moving everything to the Computer History Museum in Mountain View, California. Everyone with Lisp artifacts needs to contact this guy and find a home for them. (The guy to talk to is Paul McJones.) As a cautionary tale, he noted that we’ve already lost the first Lisp compiler– the guy who made it was actually a mathematician and didn’t even get a PhD in Computer Science! (Stoyan had a slide of the guy’s photo.) During the questions after the talk, JonL recalled McCarthy talking about one of the newer machines in the early days. When questioned about its memory limitations, McCarthy said, “You can’t possibly cons too much on this new machine!” This story got a good laugh from the crowd.

Pascal Costanza was up next. He spoke on his research about Context Oriented Programming. The basic idea is that the computer should behave differently depending on the situation– for example, if the computer “knew” the user was angry due to heart rate, etc., then it should maybe not do certain things. He noted that you can implement this sort of thing naively by having lots of if-then blocks everywhere. That’s no fun, though. Then he made what I thought was a bombshell of a point: Model-View-Controller takes code dealing with a single object’s behavior and spreads it all over the system. (He had a slide showing a UML diagram raging out of control.) He said the original idea of OOP was that behavior is defined just in the class where it belongs. With behavior spread all over the object model, objects no longer “know” how to behave!

Here’s how Costanza makes that point in his paper: “Among prominent domains to motivate object-oriented programming, graphics and people are the most widely used examples. A reason for their popularity is the fact that one can easily introduce the idea that objects know how to behave, that is how to react to messages…. However, when programs become more complex, the code for displaying objects is usually not contained in the classes to be displayed because there is a need to have different views on the same objects, often at the same time. Therefore, such code is separated into view objects that need to be notified of changes to model objects (such as instances of Rectangle or Person), leading to variants of the well-known Model-View-Controller (MVC) framework originally introduced with Smalltalk. Unfortunately, this distribution of responsibilities that conceptually belong to a single object complicates the original simplicity of the object-oriented paradigm. For this reason, some more recent object systems like Self and Squeak have even changed their frameworks for presenting objects on the screen back to the original idea that objects maintain their own knowledge about how to display themselves. However, with that they lose the desired property to offer different views of the same objects. ContextL provides an alternative approach that both keeps the conceptual simplicity that all of an object’s behavior is indeed associated with that object and still allows an object to be viewed in different ways depending on the context.”

Costanza then fired up LispWorks Professional Edition and started hammering out code. This was all CLOS code. He wrote some commands to define a class: define-layered-class, define-layered-function, define-layered-method, and deflayer. To call an object in a given context, you call with-active-layers and pass it a list of layers and a method expression against your object: (with-active-layers (employment) (display *pascal*)). He then paused for questions and the crowd went a little crazy. People wanted to know how these features interacted with inheritance and how to code it to know that other layers are active given certain conditions. Gregor Kiczales was floored: “We in the CLOS community have failed to explain why multiple inheritance is important. The problem is complicated, but our code is simpler.” He wanted to know what we could do to actually win the “political” argument. Pascal responded, “I don’t care about the political discussions. I think we should be focused on science.” (In an informal discussion out in the hall, someone noted that Kiczales didn’t think ideas in the lab actually proved anything– in order to really demonstrate the validity of an idea, you have to have a base of users that are actually applying it in a real situation.)
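ContextL itself is Common Lisp, but the flavor of context-dependent behavior is easy to sketch. The following is a loose Clojure analogy using a dynamically bound set of active layers– my own toy illustration with made-up data, not ContextL’s API:

```clojure
;; A loose analogy: "layers" as a dynamically bound set.
;; Not ContextL -- just the idea of behavior varying with context.
(def ^:dynamic *layers* #{})

(defn display [person]
  ;; Base behavior, extended when the :employment layer is active.
  (cond-> (str "Name: " (:name person))
    (*layers* :employment) (str ", Employer: " (:employer person))))

(def pascal {:name "Pascal" :employer "Some University"})

(display pascal)
;=> "Name: Pascal"

(binding [*layers* #{:employment}]
  (display pascal))
;=> "Name: Pascal, Employer: Some University"
```

The point of the real system is that the extra behavior lives with the object’s own display function rather than being scattered MVC-style across view classes; activating a layer just switches which definitions are in effect.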

[Note: Gregor Kiczales is one of the co-authors of The Art of the Metaobject Protocol.  Can you imagine presenting a paper about CLOS and having him ask you questions afterward?!  Costanza did not flinch….]

The Ruby coder that had jumped in the DSM fishbowl the night before returned right about then. From the back of the room he asked (somewhat agitatedly, no less) how you could possibly save the contexts in the case of a real web application. (I think he was coming from a… what is it… a Spring or a Hibernate standpoint or something. Something like that.) Costanza responded that if you knew what you were doing, you could get at everything. The Ruby guy also wanted to know how you could possibly handle exceptions with this stuff.

Costanza put up a slide showing some benchmarks. Inexplicably, he actually made the code faster in some situations just by adding contexts! Everyone wanted to know how this could be, and Costanza responded that it didn’t mean anything. He said to never optimize on the basis of guesses; efficiency is not important– give us more flexibility, he declared! Kiczales couldn’t handle this– in most of the meetings he’d ever been in, a 1% loss of efficiency would mean he’d get shown the door. “We can not ship something that is less effective than the thing we shipped before,” he said. Costanza said that most of our machines are idle– who cares?! Ruby and Python are very popular and they are dog-slow…. One percent? People these days don’t care about a ten percent performance hit!

JonL closed the final remarks with a word of caution about benchmarking. The competitive impulse magnifies trivial differences. Back in the day, a meaningless difference would mean that a Lisp hacker would be dead in the water and have to go back to C. JonL said there should be no more papers on Garbage Collectors unless you had an example. There should be no more benchmarking without theory and no more theory without benchmarking– it’s not the magnitude, it’s the “why” that matters.

Lisp50 Notes part I: JonL Recalls How Sussman Revealed the Nature of Intelligence…

October 22, 2008

This was a totally different crowd than what I saw at the DSM workshop. There were lots more Americans here. You had a mix of bearded Lisp hackers, a few not-yet-middle-aged guys who were running significant projects, and then some younger guys that were either still in college or hadn’t been out for very long.

To understand the conference, you’ve got to understand the backdrop. I’d just sat through a day of presentations that were mostly focused on Java, Eclipse, and UML diagrams. The “last word” of the day was from a Smalltalk programmer who said that the competing tools in his area of work had not improved upon the solutions that he’d had 15 years ago or so. Educators attending the symposium were talking over lunch about how they couldn’t find good examples for applying some of the concepts they were teaching. Anybody I met, I would say to them: “I can’t believe I’m going to see John McCarthy and Alan Kay tomorrow!” This didn’t really mean anything to most people I tried to chat with. Collectively, the net effect of the preceding day was to produce a mild sensation of frustration, disappointment, and alienation.

I was nearly late for the opening session of Lisp50. I rushed in just in the nick of time… only to find that Lisp50 was starting ten minutes late or so. There was no posting of the schedule anywhere that I saw, while the workshop I was in the previous day had a copy of the schedule at every seat and even a booklet of all the papers to be presented. Add to this the fact that many of the Lisp speakers had some sort of trouble with their PowerPoint slides and that there was no slot on the schedule for an evening meal break… and it just appeared that collectively the Lisp community was doing everything it could to meet its own stereotypical expectations!

So the first “talk” was Guy Steele and Richard Gabriel re-enacting a talk they gave back in 1993. This was amusing in places– the audience laughed in most of the same places as the 1993 audience did– but it was a little frustrating. I mean, we have Steele and Gabriel on stage… and they’re reading. I want to know what they have to say about right now. I want them to preach to the choir. I want them to explain how to finally win the “language wars” and why it matters. Something. Anything! Instead we get a play-by-play of how Lisp communities evolved and coalesced in the seventies and eighties. The net effect of this was to leave the impression that we’re reduced to the significance level of, say, a Confederate Civil War reenactor. The battle is over. We lost. There’s nothing left but to pin down the final details of the history of our side so we can bury it all. Yes, that’s it… that’s it exactly. I felt like I was at a funeral! This “talk” set a tone that our proceedings were going to be more about nostalgia than anything else. (I’m not saying this wasn’t interesting… I’m just talking about the tone here given the context. Maybe I was the only one who felt this way, but I think the general pulse of the crowd was mildly uncomfortable because of these sorts of unspoken memes floating around in our collective unconsciousness.)

JonL White was the next speaker. I simply cannot do justice to what he said and what he represents. Suffice it to say, if all you know about Lisp is some stuff you read in a Paul Graham essay and a few textbooks here and there… then you really don’t know anything. This man has a tremendous intellect… and has had a hand in nearly everything as far as Lisp is concerned. (Did I see that right on the slide of specific language contributions– that he was responsible for defun? I wish I had that slide….) Fifty years from now, I’m going to be telling people, “I saw JonL at the Lisp50 conference.” I cannot do justice to this guy’s presentation, so here are just a few tidbits:

* He used several Alan Perlis quotes to set the tone. These were all great quotes; the same ones from SICP that inspire me so much. JonL really misses Perlis.

* He mentioned something about dealing with the archetypal “bipolar Lisp programmer” types on the job and exclaimed, “how can you supervise a Lisp guy?! You just take what he does…!”

* JonL was responsible for hiring Stallman at the MIT AI Lab. Stallman has contested this and claimed that some random administrator was responsible. JonL explained that the administrators don’t decide anything– the way you get in is by making friends with someone in the lab.

* JonL encouraged us all to go to the International Lisp conference. “Lisp is the back forty for a lot of people,” he said. (That is, you do whatever you need to do when you work for the master, but if the work is for yourself, a lot of people tend to do it in Lisp.)

JonL spoke about growing up in the Ozarks. He talked about how his parents influenced him intellectually and genetically. He talked about how racist everyone was there and how he took the slurs personally somehow…. He talked about how he ended up hanging out with Jews and how he sort of became, as it were, an “honorary Jew” because they figured if he was so smart, he must be a Jew, too. (I’m trying not to mangle this too badly.) So Sussman asks him one day, “Why are Jews so smart?” Now, JonL was pained at recounting the answer he gave. He confessed to having picked up on some of the racism that was endemic to the Ozarks. “Uh… must be their genes,” he’d said back then. Sussman was very irritated at this answer. “WRONG,” he said, “it’s the LAW!”

I have no idea how many people “got” this. JonL wasn’t too sure either. The default assumption is that we’re all rabid libertarians and atheists, I suppose. JonL explained that Sussman was referring to the Torah… the Jewish scriptures. Sussman argued that because Jewish fathers are required to teach their children how to read and write– so they could understand the Law– this somehow gave them a fantastic intellectual foundation. JonL referred to a talk that Sussman gave somewhere– it was extemporaneous, but better than most prepared talks. (I want to see this talk if anyone finds a recording!) JonL was totally blown away by this talk… the best thing he’d ever heard. But in it, Sussman argued that intelligence was transmittable and improvable. Somehow, if you can (legitimately) learn more words, you expand your ability to think. This tied into symbolic thinking being about giving names to things. (This reminds me of that strange story in Genesis about Adam naming the animals in the garden of Eden.) Sussman said, “great ideas can be understood by an ordinary person.”

(That’s a very encouraging thought. I suppose there’s hope even for us “average developers.” This is all the more reason to continue learning a new programming language every year or so. It kind of gives one a little more respect for all of those among the USA’s founding fathers that could read and write Latin, Greek, and Hebrew….)

JonL had spoken earlier about how he’d gotten a great prep-school type education even though he grew up out in the middle of nowhere in the Ozarks. Farming and industry were all dead there, but… they still had this heritage of education. JonL pointed out how, for us Americans, this traces back to the Puritans. The fundamental concept of our form of government is that man’s government cannot work. That’s why you have all of those checks and balances– those Puritan-types did not trust man to get things right! (Note to self: compare and contrast this with socialism sometime. Try not to use the word “hubris” gratuitously.) Our education was meant to provide a check on our untrustworthy governmental institutions– but just look at how things are developing in, say, the California public schools.  Things do not look good; we’ve lost something….

That’s the gist of JonL’s talk based just on my hasty and imperfect notes. (That last “tidbit” sort of raged out of control there….)  Please feel free to post corrections and clarifications if you feel they are required.  I didn’t get the feeling that he was a raving evangelical or anything, but I was surprised by the philosophical bent to his talk.  But what do you expect from a guy that cuts off his tie with a pair of pruning shears…!

OOPSLA 2008 or Bust!

October 16, 2008

Sadly, I have to cut the conference short for work reasons.  (Grumble grumble….)  Much of the stuff I’m missing appears pretty academic, though– way over my head. (But I really was hoping to take in the whole thing…. Sigh.)

Here’s what I’m catching:


The 8th OOPSLA Workshop on Domain-Specific Modeling
Juha-Pekka Tolvanen, Jonathan Sprinkle, Jeff Gray, Matti Rossi
Room: 210
8:30 – 12:00, 13:30 – 17:00


Lisp50 – The 50th Birthday of Lisp at OOPSLA 2008
Pascal Costanza, Richard P. Gabriel, Robert Hirschfeld, Guy L. Steele Jr.
Room: 204
8:30 – 17:00

Advanced Codemunging: A Report from the Trenches

May 13, 2008

A year ago, I began studying Lisp and Emacs. I would ultimately get through about half of SICP and code up a few small Lisp projects that mainly showed how far I was from grasping the point. Having Emacs skills did come in handy a few times, though. Engaging in such focused deliberate practice was difficult: I rarely came up with anything truly innovative or useful, but instead had to constantly face my weak areas and incompetencies– and doing all of this pretty much in public would ensure those weaknesses would become notorious among the illuminated. I wasted some time blogging about programming practices and things gradually culminated in a manifesto of sorts. While I now had enough technical know-how to address some of the biggest problems I’d faced as a career programmer writing simple database apps and web pages, I also had a minor crisis in choosing which language to implement my new vision in. Circumstances at work (and a large decrease in free time to experiment in) would soon narrow my options for me, though. This blog post is about what happened next.

I got really lucky in the way things panned out in my career. You’ve probably read where Joel Spolsky was talking about how awful it is to get stuck being an in-house developer. More recently, Paul Graham has gone off on how we’re all supposed to be like the start-up coders that he funds. Really, most of us aren’t *that* talented to begin with… or at least, we’re not willing to live the start-up lifestyle or take on the associated risks. But there is another option that can be almost as satisfying as working in a Graham/Spolsky type setting: be the *only* developer in a company that’s large enough to have interesting problems and that’s looking to cut the cord with as much of their legacy software as they can as they move to a new system.

If you can land that sort of job, you’re probably savvy enough to avoid most of the stupidity that comes with dealing with corporate type development. Because you’ll be the only developer on staff, you won’t have to waste time building consensus with fearful managers and mediocre “day coders”– and face it, they would never have gone along with your insane ideas to begin with. Also, you have a huge business value bearing down on the side of good design: you’re essentially getting paid to eliminate and refactor away all of the crappy development projects that have accrued over the years. The scope of such a change is so big (and everyone else is so busy), you’ll have to come up with big architectural ideas on your own and you probably won’t even have to pitch them to anybody except the other techy type people that are in your (small) IT department. You’ll just have to make the right decisions on your own.

Looking back on the most painful moments of my career, I think I have pinpointed the cause of most of the disasters. Because I needed the experience, I allowed myself to get into positions where I had to maintain code that I had no authority over. Whether out of my own fear or by decree of my immediate manager, I was isolated from the problems I was supposed to solve. I would continually get requirements that were just enough wrong to be useless and would often spend as much time removing them as I did implementing them. The common denominator in most successful projects/modifications was a situation where I rewrote entire sections of code (with or without permission) and where I also took the time (with or without permission) to hash things out with some smart folks lower down on the totem pole– people that actually did the work and people that would end up using the application. Talking to people like that would often uncover the real problem– but then I’d have to “sell” a new solution to managers at one or more companies. I’d often spend more time talking about the issues with management than I did writing code. And it’s those sorts of interactions that become the fodder for an unlimited number of Dilbert strips. And because of all of the bad requirements, IT departments start to withdraw from their users and create buffers of bureaucracy to protect themselves from getting “burned.” You can let this insanity grind you down… but you can also take advantage of it if you play your cards right.

You’ve got to realize that while management types generally can tell you what to do (and sign your paycheck and stuff), they can’t really help you with the bulk of your development work. Too many of them are going to just tell you what kind of buttons to put in the application. “I just need one that has ANSWERS written on it.” Right. If you can just get your foot in the door, you won’t be so dependent on them. Any sort of application that you can roll out is your ticket to finding out how things really should work. Talk to your users at all levels. “Joe Six Pack” knows a heck of a lot about the company and will not waste your time with stupid requirements. Generally speaking, no one asks his opinion: he’ll be grateful to you for giving him a voice and making his work easier.

Growing up as a geek, you probably played off-beat rpg’s and war games in your basement. In order to run them, you had to have the bulk of the system “in your head”. This type of thing is very similar to a company’s overall business process– and your contacts with people on the ground and your familiarity with the back end data structures will allow you to get large portions of a company’s business processes “in your head” as well. (As long as there’s some room left up there, what with all the AD&D stuff junking it up.) The trick is to address the process issues in a coherent way: what’s convenient for one person may not be efficient for the whole. It’s this kind of knowledge and savvy that separates you from the code-monkeys. People that have a slightly smaller interest in programming will often use the position they gain from this to become de facto controllers of the company. But you’re a geek, so you’ll probably just be glad to have the chance to deal with management as an equal instead of as a flunky. That way, you can get a chance to implement the code designs of your dreams! Party!

So, once you’ve got a decent job where you have the authority to develop code effectively, how do EMACS, Lisp, and SICP impact your day to day work? The main area is in your attitude. Even if you’ve never truly mastered those tools, you end up cultivating a conviction that your program should be dynamic and malleable at all levels. Your abhorrence of repetition and bloat will be cranked up to “11” and you’ll just have this default expectation that your code should magically respond to you no matter what you’re trying to achieve. You’ll also be exposed to enough unusual ideas that you’ll see things you can do back in “blub” that you never would have imagined. You’ll have a wider range of programming styles at your disposal, and your ability to discern various “code smells” will be much sharper.

Back on the job, I did choose a ubiquitous “blub” as the primary language. The lesson of Lisp is that it takes very little to engineer a custom programming language: just a few axioms will get you something reasonably complete in short order as long as you’re able to combine the pieces into more complex concepts. I realized that if I stole somebody else’s evaluator, I had 90% of what it took to make a dynamic language. I embedded the evaluator into an “environment” that could manage the overall state. I then wrote a small deck of crappy string functions that I used to “compile” lines of text into “code objects”. From there it was a piece of cake to write code to “execute” the objects by running a series of operations on the environment variables based on what’s specified in a list of those “code objects”. This was all done with the crappiest of quick-and-dirty programming techniques. Just awful. It didn’t take long, though. And it doesn’t have to be great. It just needs to be “good enough” for now.
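
The post never names the “blub” in question, so here is a minimal, hypothetical sketch of that architecture in Python– an “environment” that owns the overall state, a crappy string “compiler” that turns lines of text into “code objects”, and an executor that runs those objects against the environment. Every name in it is invented for illustration.

```python
class Environment:
    """Holds the overall state that the 'code objects' operate on."""
    def __init__(self):
        self.vars = {}

def compile_line(line):
    """Crappy string munging: turn 'set x hello' into a code object (op, args)."""
    parts = line.split()
    return (parts[0], parts[1:])

def execute(code_objects, env):
    """Run each code object as an operation against the environment."""
    for op, args in code_objects:
        if op == "set":        # set <name> <value>
            name, value = args
            env.vars[name] = value
        elif op == "copy":     # copy <src> <dst>
            src, dst = args
            env.vars[dst] = env.vars[src]
        else:
            raise ValueError("unknown op: " + op)

env = Environment()
program = [compile_line(l) for l in ["set greeting hello", "copy greeting banner"]]
execute(program, env)
print(env.vars["banner"])  # -> hello
```

The point isn’t the (deliberately awful) parser– it’s that once text becomes data, the executor is trivial to extend with new operations.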

Once the core of this (albeit awful) custom language is done, you’re (generally) only about two hours away from being able to implement any given language feature that you want. And thanks to a small familiarity with Lisp, I realized that there were a lot of slick ideas that open up once a programmer gains control over what gets evaluated and *when*. Macro-like functionality became possible, among other things…. In practice though, coding a new language feature takes me maybe ten times as long as it would to write a comparable function to do the same thing. There are things that can be done to cut down that time, and it’ll be fun to someday compare and contrast my ideas with what can already be done with Lisp “out of the box.”
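
As a rough illustration of what “control over what gets evaluated and *when*” buys you: if code is just data, the evaluator can leave some of its arguments unevaluated, which is the heart of macro-like conditionals. This sketch (a Python stand-in, all names invented) evaluates only the branch that is actually taken:

```python
def evaluate(expr, env):
    """Evaluate a 'code object' (a nested tuple) against an environment dict."""
    op, *args = expr
    if op == "lit":                       # literal value
        return args[0]
    if op == "get":                       # variable lookup
        return env[args[0]]
    if op == "if":
        # The branches arrive UNevaluated; only one of them is ever run.
        test, then_branch, else_branch = args
        return evaluate(then_branch if evaluate(test, env) else else_branch, env)
    if op == "add":                       # evaluate all args, sum them
        return sum(evaluate(a, env) for a in args)
    raise ValueError("unknown op: " + str(op))

env = {"x": 10}
expr = ("if", ("get", "x"),
              ("add", ("lit", 1), ("get", "x")),   # taken branch: 1 + 10
              ("lit", "never evaluated"))
print(evaluate(expr, env))  # -> 11
```

In an ordinary function call, both branches would have been computed before the function ever saw them; here the untaken branch is never touched.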

What I ended up using my roll-your-own language for was to build a scripting language for customizing my main application. When the application is run, it reads a script to set up the environment. All of the GUI is bound to the environment. Advanced users can customize 90% of the application’s functionality without recompiling– and as a bonus, I have a way to test and experiment with malfunctioning desktops without having to use the old “it works on my machine” excuse. Not quite as good as a REPL prompt, but good enough. (For now.)
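
A sketch of the startup-script idea (again a Python stand-in, with every name invented): the application parses a user-editable text script into the environment, and the “GUI” simply renders whatever the environment contains– so behavior changes without a recompile.

```python
def load_script(text):
    """Parse a dirt-simple 'name = value' script into an environment dict."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # skip blanks and comments
            continue
        name, _, value = line.partition("=")
        env[name.strip()] = value.strip()
    return env

def render_gui(env):
    """Stand-in for real GUI binding: one 'widget' per environment entry."""
    return [f"[{name}]: {value}" for name, value in sorted(env.items())]

script = """
# user-editable startup script
title = Order Entry
default_region = Midwest
"""
env = load_script(script)
print("\n".join(render_gui(env)))
# prints:
# [default_region]: Midwest
# [title]: Order Entry
```

Editing the script (or swapping in a user’s copy of it) changes what the application does– which is also what makes “it works on my machine” problems reproducible.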

Because I’m a database app developer, 80% of my code is about connecting to database X, prompting the user for some stuff, getting a deck of rows, and maybe doing something for each row in the deck. (I get so sick of seeing the same crappy DAO, ADO.Net, and stored procedure call code over and over. Blech!) It sure would be nice to have a language designed completely around doing exactly that. And the thing is… I do! My custom language has all kinds of features for manipulating blobs of text. My “environment” stores the blobs in a prominent place, and the GUI automatically picks them up, displays them, and knows how to operate them. This may sound crazy at first, but I’ve been able to come up with some powerful front end features because of this approach. Features that I never would have thought up emerge naturally and give my users an unprecedented amount of functionality. Dozens of minor problems routinely get addressed all at once with relatively simple abstractions.

For the first time, I am coding against the problem at both ends. The blobs of text that represent the fundamental unit of work in my system are abstractions that I once referred to as being “hairy at both ends” after trying to describe something Peter Norvig was doing in a code sample. These abstractions allow the functionality of the program to be extended gracefully– and I’m generally only a “very short time” away from being able to extend those functions for either myself on the code side or my users on the GUI side. No… for the first time, I have an architecture that acknowledges me as the primary “user”. Refactoring is now easier than any other alternative– and the business case for it can be made convincingly at every stage of the project. I can code features that make it easier to manage growing collections of business object blobs in my heavily syntax-sugared scripting language. New blobs are automatically picked up by the GUI… and serendipitous GUI features emerge naturally because of the consistency inherent in the abstractions. Each layer of the architecture builds on and magnifies the efforts of the underlying layers. And every line of code I write is impacted by my (admittedly mediocre) understanding of recursion and the closure property.

You see a lot out there about those supposedly super-duper type programmers that are worth ten times as much as their lesser colleagues. While I will not claim to be one of those guys, I do feel at least ten times as effective a programmer as I was a year ago. At any rate, I love my code base, I enjoy improving it continuously, and I enjoy seeing my users immediately benefit from the wacky ideas I come up with back at my desk. For the first time in a long while, I wake up excited every morning and I actually look forward to going to work. Life is good.

Thanks John, Harold, Jay, Peter and Paul. You guys have given me answers to the core problems I face every day– even as a lowly database app developer. I may not have the privilege of being able to code in a lisp dialect professionally, but at least I can now begin to “Greenspun” with the best of them….

Infocom Drive Discovery May Lead to Recovery of MDL Source Code

April 21, 2008

An Infocom drive (or a CD containing the contents of it) has been recovered that contains the source code for every released and unreleased game produced by Infocom.

As a result of this discovery, some private emails from the Infocom Implementors have been published on a blog. Anyone who’s anyone in the world of “interactive fiction” has responded– including Anita Sinclair of Magnetic Scrolls, several Infocom authors/programmers, Graham Nelson and Emily Short.

Lisp enthusiasts will note that the discovery of this drive may lead to the recovery of the source code to the MDL programming language. Zork coauthor Dave Lebling appears to be in contact with the author of the blog (and hopefully the owner of the drive/CD) to discuss this and other topics.


In the comments, Tim Anderson writes: “the disk seems to be what was shipped to Activision (along with its containing Sun server, of course) when Infocom Cambridge was shut down. No particular reason, therefore, why it should have included the full MDL source–no one at Activision would have known how to do anything with it.”  Meanwhile, Andreas Davour has found an MDL interpreter.  Marc Blank is posting tips and references….  Wow, this is cool.

Another Update: Allen Garvin has posted a link to the MDL primer.  Sweet!

A Quick and Dirty Look at Exploratory Programming in Common Lisp

January 31, 2008

As the uproar subsides over the recent release of Arc, I can’t help but conclude that the folks that are most outraged by the lack of Unicode support would indeed have little interest in the results of Paul Graham’s labors. Their core values and world view are so antithetical to Paul’s that they simply cannot see the point. You don’t believe that to be true, of course. You just think Paul Graham is stupid. It doesn’t matter that he wrote On Lisp or that guys like Peter Norvig have a few nice things to say about him. You know he’s insane… and somehow, you’re hurt, insulted, and angry all at once. This is understandable, really. Lisp really is sort of an alien technology. The more you use it, the more it infects your brain. You get all wrapped up in what suddenly appears to be the right way of doing things. But this explanation is meaningless to you. You shrug it off and continue to justify your rage.

But at least give me a chance to make one brief analogy. You know how it is with your wife/girlfriend. You want to be nice, so you offer to take her out to eat. You ask her where she wants to go. She says she doesn’t know. You say you’ll take her anywhere she wants. She insists that you pick somewhere that you want to go. You pick a place that you think would be nice– and she immediately says she doesn’t want to go there. You get frustrated and annoyed. Her feelings get hurt because you can’t hide the sharpness in your words… and suddenly you’re stomping off to avoid the litany of complaints that are sure to follow. “You never do x; you always do y; blah blah blah.” As you mope in your computing dungeon, you feel that something significant has transpired. You feel like there was no way you could have won, yet you’re sure you missed something. You, my friend, are now in the dog house indefinitely.

And that’s the truth of it: you live with an “alien.” You look at the same events and view them differently than she does. You think differently. You have different needs. The “alien’s” rationalizations seem incredibly illogical to you… and you just can’t understand it. The same thing is going on between you and Paul. (Not the living together part. You know… the misunderstanding part. It’s an analogy. Stay with me here for a second.) More than likely you’d rather justify your confusion and disappointment. Certainly, sifting through the kibble to find the meat is the last thing on your mind. But I’m going to give you the benefit of the doubt. Perhaps if you had a little more concrete explanation of how Lisp enables exploratory programming… maybe then you would cut Paul some slack. (Just a little, maybe.) Here goes….

Lisp allows you to define symbols that reference atoms and lists. This allows you to create a shorthand for whatever it is that you’re working on. Below we create a symbol that represents an x-y coordinate and then reference that list in another symbol:

CL-USER> (setf a '(4 2)) 
(4 2) 
CL-USER> (setf b (list 42 a)) 
(42 (4 2))

Now one thing you should remember… that really is a reference to ‘a’ “sitting” in b’s list. If I make a change to ‘a’, then ‘b’ will change, too!

CL-USER> (cadr a)
2
CL-USER> (setf (cadr a) 7)
7
CL-USER> a
(4 7)
CL-USER> b
(42 (4 7))

I don’t know about you, but in my mainstream OOP language of choice, my tendency is to start with my own Point class in its own code file. Next I’m coding up a separate Widget class in its own class file. It has an integer property and a point property. I code a variety of constructors in both files and code the getters and setters for the properties of each. Then I start feeling guilty about my Widget class because it violates the Law of Demeter, and I want to go back and put a wrapper on the point property so that no user of a Widget object will reach through to the methods and properties on its internal Point object reference.

Notice above that we could define a getter function with the cadr operator to retrieve the y coordinate of our point. Setf can then be used with that getter to make an instant setter– setf already knows how to invert built-in accessors like cadr, and defsetf lets you register the same trick for getters you write yourself. So in Lisp… once you define a getter, you’ve practically defined a setter, too. This may seem trivial, but as your object model evolves, I sure get tired of mucking around with property definitions in a half dozen files. In Lisp, all of that stuff is optional. You can just jump in and get started. Note also that because lists can reference other lists, you get a “quick and dirty” form of inheritance for free. Sort of. But even better, you don’t have to take any extra effort to write special code for setting up collections and arrays. The list’s hierarchical structure gives that to you for free.

CL-USER> (setf c (list '(x y z) 42 a))
((X Y Z) 42 (4 7))
CL-USER> (defun tags (x)
           (car x))
TAGS
CL-USER> (defun weight (x)
           (cadr x))
WEIGHT
CL-USER> (defun point (x)
           (caddr x))
POINT
CL-USER> (tags c)
(X Y Z)
CL-USER> (weight c)
42
CL-USER> (point c)
(4 7)

So above, we demonstrate the creation of property “getter” functions… and also show how one of those properties can be a list of items with practically no extra effort. I know, I know. Strongly typed collections that take advantage of those newfangled generics features are great fun, but when you’re prototyping you’re not all that interested in that sort of paranoid defensive restrictiveness. You may not even know what “types” each of your properties should be yet!

Here we define a symbol that represents a list of our objects. Notice that the list operator is sort of a universal constructor:

CL-USER> (setf d (list (list '(x x) 17 a) 
                       (list '(y z q) 56 a) 
                       (list '(z z z) 73 '(2 1)))) 
(((X X) 17 (4 7)) ((Y Z Q) 56 (4 7)) ((Z Z Z) 73 (2 1)))

We need to instantiate three of our objects and store them in a list? No problem! You just _do_ it…. The dirty thing here is that you don’t have to define any constructors for your objects. Further… you don’t have to have “all” of your object defined or set up in order to test parts of it.

Let’s set up a simple function designed to operate on a list of our objects. (This is a completely arbitrary example.) But wait… we’re not sure what we want to do as we operate on each item. Well… let’s just skip that part and let the caller of the function tell us what to do each step:

CL-USER> (defun do-something (fn &rest args)
           (funcall fn (car args))
           (if (not (null (cdr args)))
               (apply #'do-something fn (cdr args))))
DO-SOMETHING
CL-USER> (apply #'do-something #'(lambda (x) (format t ">> ~a~%" (weight x))) d)
>> 17
>> 56
>> 73
NIL

And instead of setting up a variable to hold our list… let’s just use the function with some objects we define on the spur of the moment:

CL-USER> (do-something #'(lambda (x) (format t ">> ~a~%" (weight x)))
           '((X X) 17 (4 7))
           '((Y Z Q) 56 (4 7))
           '((Z Z Z) 73 (2 1)))
>> 17
>> 56
>> 73
NIL

This may seem weird, but the more you use Lisp, the more natural this becomes. Lists are fundamentally recursive: they’re defined that way, so you tend to write recursive functions to operate on them. Many built in functions will take functions for their arguments, so you tend to do the same thing with your own code. This not only lets you defer some design decisions, but it also lets you build reusable skeleton functions that are broadly applicable. Finally, “rest” parameters give you even more flexibility… and *another* excuse to write recursive functions. It’s trivial to modify a function to utilize the rest parameters, so you end up using it in places you wouldn’t have thought to.

The syntax for what we just did may seem a little hard to keep up with. When do you use funcall and when do you use apply? When do you use neither and just call the function? It can be frustrating at times, but that’s the sort of thing that motivated some of the changes in the Scheme dialect.

(I know… I could have used a built-in mapping function for the above. But that’s one cool thing about recursion. If you’ve forgotten the syntax or the name of a built-in function, you can generally write a recursive substitute for it in a pinch. This keeps you rolling along even when you are hacking without reference materials or an internet connection handy.)

Anyways, the main thing here is that you don’t have to waste time juggling class files. You don’t waste time writing properties and constructors and overloaded versions of your constructors. You’ll use “getters” as an elementary abstraction barrier of course, but you don’t have to endure constant object instantiation rituals. You just type out the list that represents your object’s state whenever you need a new one. (This means you can write unit tests that are very clean and concise, by the way… and you can even cut and paste them directly into the REPL prompt without nearly so much fuss and bother as you might have on other platforms.)

Also, you know how annoyed you get with having an IDE reformat your code even when you don’t want it to? Sometimes the strictly enforced “one class to a file rule” can be equally bothersome. I like to group all of my boring “getter” functions together in one place. This helps document the classes’ read/write fields. Also, related methods are kept together even if they belong to different classes. This makes it easier to think about related parts of the program… which leads to better utilities “bubbling” out of the prototyping process. (By the way, the REPL is just plain great. In terms of its significance, I’d say it’s easily up there with Source Control and Unit Tests in terms of how much it impacts your programming. The amount of time developers waste in the compile-crap loop is insane… and it may be that an entire strata of unit tests are necessitated by the lack of a REPL. But that’s a different story.)

Hopefully this little article can give you some insights into some of the “quick and dirty” type things Paul was talking about. This is not advanced voodoo rocket science by any means, but this is stuff you can master fairly quickly as you start programming in the Lisp dialect of your choice.