(Lots of ((Irritating, Spurious) (Parentheses)))

Mar 04, 2005 14:29


Derisive comments are often made about the syntax of Lisp, as witnessed by some reproaches on my previous blog entry. Thus the half-joking, half-serious backronym of Lots of (Insipid | Irritating | Infuriating | Idiotic | ...) and (Spurious | Stubborn | Superfluous | Silly | ...) Parentheses, and accusations that Lisp syntax would make ( Read more... )

lisp, tao of programming, meta, dynamism, essays, code evolution, en

Leave a comment

averros March 5 2005, 02:23:56 UTC
In the end, when people attack the syntax of Lisp, it is most usually but a rationalization without any technical merit regarding the syntax itself. This rationalization serves to cover up a defense mechanism against a foreign culture.

You are making two assumptions: 1) that this is a foreign culture, and 2) that this is all a matter of psychological hang-ups of the "mainstream" programmers.

Well, 1) is basically an accusation of being uneducated, or simply unable to comprehend the obvious superiority of your preferred approach. You are making very unwarranted assumptions about your opponents. I am a specialist in programming languages (among other things), wrote some compilers and interpreters, and consider myself rather familiar with LISP and its numerous progeny (Planner, Scheme, etc). I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it. In any case, I do not think that this is an argument suitable for a civilized discussion.

The point 2) is contradictory: you are saying that people have psychological problems after insisting that languages should be compared on technical merits. The problem with LISP is not technical. It is psychological. For the vast majority of practical programmers, LISP is decidedly inconvenient. The inability to come up with some truly useful M-syntax is a part of it. Another part is that human cognition does not deal well with recursion or semantic-based recognition. This is a well-established fact, known to any student of psychology. That's why people prefer "flattened", less compositionally powerful environments, with rich decorative "syntax" - a textbook example is using colorful beer keg handles on the control levers of a nuclear plant. Reduces mistakes, and speeds up recognition, you see.

The thrill of coming up with a clever solution for a particular algorithmic need (and LISP is great for expressing novel solutions) is quite familiar to any good programmer. However, when I code something deliverable I, like any seasoned professional, try to avoid doing anything clever, and prefer to use stereotyped, well-understood moves, which are made into reflexes by the years of practice. That's how I'm able to write good-quality code fast enough to meet the typically impossible timelines. Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.

So, any psychological crutch (like rich syntax) helps. Coming up with good syntax is artistry and psychology way more than CS (that may explain why most languages are so horrible). It is only too easy to get carried away in any direction, from total austerity (LISP) to profligacy and noise-like compactness (APL, hmm, and to a lesser extent, Perl) or inane verbosity of COBOL. Note that even the purported citadel of syntactic purity was corrupted quickly with not strictly necessary things like quotes, square brackets, and, well, quite a few styles of macros :)

Logically, the infix notation is definitely inferior to postfix or prefix notations; but this is the conventional notation of mathematics. That's what people are trained to use from their mid-school years. Mentally translating from one notation to another is an unnecessary step, which is better done by the software, leaving people with a smaller gap between their mental models and the code they need to write or comprehend. Of course, anyone is free to design a language ignoring that training, and to insist that people should rather master thinking in reverse Polish notation or something - but what happens is that most people will simply move on to a language more convenient to them. To date, no language insisting on non-infix notation (or ignoring conventional operator precedence rules) has gained more than niche acceptance (Postscript may be an exception, but very few people actually write in it).
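The "better done by the software" step above can be illustrated with a small sketch (my example, not from the thread; in Python rather than Lisp, leaning on Python's bundled `ast` parser to do the precedence-aware infix parsing, then printing the result as a prefix S-expression):

```python
# Illustrative sketch: let software do the infix-to-prefix translation.
# Python's own parser (the ast module) already knows operator precedence,
# so * binds tighter than + without any work on our part.
import ast

OPS = {ast.Add: '+', ast.Sub: '-', ast.Mult: '*', ast.Div: '/'}

def to_prefix(node):
    """Render a parsed arithmetic expression as a prefix S-expression string."""
    if isinstance(node, ast.Expression):
        return to_prefix(node.body)
    if isinstance(node, ast.BinOp):
        return f"({OPS[type(node.op)]} {to_prefix(node.left)} {to_prefix(node.right)})"
    if isinstance(node, ast.Constant):
        return str(node.value)
    raise ValueError(f"unsupported node: {node!r}")

print(to_prefix(ast.parse("2 + 3 * 4", mode="eval")))  # (+ 2 (* 3 4))
```

The conventional precedence rules are preserved in the output: the multiplication ends up nested inside the addition, exactly as a trained-on-infix reader would expect.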

Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.

Reply

Natural Languages? averros March 8 2005, 00:18:05 UTC
Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways.
Even that only evades the issue. Because why have hundreds of years of mathematics evolved a sub-optimal method of expression? In fact, we might well ask why (all? most? many?) natural languages are complex context-sensitive beasts. Are we merely at the mercy of the very first grammarian, or is there something in our primitive brain which is wired to be more receptive to complex grammars?

Reply

Re: Natural Languages? fare March 8 2005, 00:29:15 UTC
(1) The usual mathematical notation with lots of symbols in an infix way makes a lot of sense when you write on a 2D piece of paper or blackboard. But it's just not adapted to text-based programming.

(2) Grammars are not complex, but big. That is, each rule is simple, but there are lots of rules. A massively parallel brain can cope with that. That is, as long as it has to deal with similar stuff.

(3) Programming is different from the usual stuff. Semantics is part of programming, and it's part of what any programmer must master so as to write decent programs. Some languages make it easier to discuss semantics than others.

(4) Lisp doesn't prevent one from learning lots of rules. Actually, in Common Lisp (as opposed to, say, Scheme), there are a tremendous number of rules. But these rules are in the semantics, not in the syntax, so the programmer can focus on what matters rather than be diverted by details.

Reply

averros March 8 2005, 11:36:41 UTC
1) The screen in front of me is a flat 2D thingie with somewhat worse image quality than printed paper. Did I miss something? And, yes, it is perfectly capable of displaying math notation, as evidenced by the snippet of a paper on LQG in the adjacent window :)

2) The human brain is massively parallel. And it has the benefit of some billions of years of evolution in dealing with large detailed spatial environments. Which do not have any recursion to speak of, are not regular or repetitive, and cannot be combined in any meaningful way. The most important thing is to recognize them by a multitude of visual, aural, olfactory or tactile cues - and so not to get lost.

The grammar-capable brain regions are relatively recent, 100k years or so. They're small, too (check Broca's area on a brain anatomy map, for example, and compare with the size of the visual cortex (occipital (V1-V4) and inferotemporal lobe)). No natural language has anything like deeply nested structures (interestingly, the most "recursive" language is Vietnamese :)

The simple truth is that people are very poor thinkers when faced with recursive structures. Gödel's 1929 results could well have been accessible to Aristotle or Mohammed Al Khorezmi, but it took, well, millennia to come up with what, in retrospect, seems to be a fundamental but rather simple statement about the foundations and meaning of logic.

Most practical programming projects are, in fact, very similar. There's quite a limited set of movements, algorithms or tricks an "average" programmer has in his bag. Those are practically always sufficient to get the job done.

3) Programming is different from the usual stuff. Yes, it is. It requires people to perform unnatural tasks, all day long. That's why it is much harder than, say, driving. But, at the end of the day, both professions are about telling machines what to do.

The result is obvious - most people are extremely poor programmers, and even good programmers are apt to make frequent mistakes. A driver with comparable error rate would get himself killed on the very first trip.

Some languages make it easier to discuss semantics than others.

Programming is not about discussing - it is about translating intent into code. The descriptive power is often counterproductive - replacing mechanically learned movements with deliberations and reasoning is a sure way to sabotage the actual task - namely getting from "we want this and that to be done" to "computer: do this and that".

Reply

averros March 8 2005, 11:36:50 UTC
4) Syntax matters. The semantics of all useful languages is identical. Anything which can be written in LISP can be written in C or assembler, or FORTRAN, for that matter, and vice versa. Most likely in a roughly comparable number of tokens, on average. The only difference between languages is, by and large, the syntactic sugar, and the amount of dirty work the language compiler/interpreter and run-time do for the programmer.

LISP per se isn't a very semantics-rich language. You can write libraries which do a lot of things in it, but you can do that in any other language as well. If we're talking about semantic power, the "best" language is, of course, the Unix shell. It has an entire mature OS backing it. Still, it is only used as glue, as it sucks in other respects. (Funnily, at some point I used LISP as a shell, on a PDP-11, just for the heck of it... but in the end I went back to csh - brevity is important in a command-line interface :)

There are other things besides syntax which matter - like restrictions. The protection of the programmer against his own mistakes is an important function of languages, but that, by necessity, limits his freedom of expression. Or take the inability to write self-modifying code. From the purely expressive-power point of view it is a shortcoming. From the point of view of writing debuggable, robust and reasonably secure software, it is a significant advantage. And the significance of these restrictions grows with the size and complexity of the programming project - nobody in his right mind would do anything as complicated as an OS, air traffic control, or (yes) loan processing in an average bank in any language which allows the programmer to change module interfaces dynamically.

Cognitively, you don't want a shared terrain to change features rapidly - it will make any kind of collaboration impossible. Of course, this can be achieved with some degree of self-discipline in any language, but it is much better to have discipline actually enforced. You can do that in LISP by insulating a programmer from the core language, but all you get this way is the same C++/Java/Ada, but with verbose syntax.

Reply

averros March 10 2005, 19:15:39 UTC
< you > < are absolutely="right" > < people > < dont > < like deeply="nested" > < expressions > < / expressions > < / like > < / dont > < / people > < / are > < /you >

Reply

ibsulon February 13 2007, 08:32:04 UTC
When was the last time you heard someone who liked XML?

XML was meant to be easy for a machine to parse, not for a human to read. (Humans were supposed to use tools if so necessary.) Further, it was meant for data, not code.

Reply

averros February 13 2007, 16:11:20 UTC
"XML documents should be human-legible and reasonably clear." - The design goals for XML

Reply

XML programming languages... fare February 14 2007, 00:06:02 UTC
Scary as it may seem, some people program in XML. I myself had to use XSLT once... ouch. Plenty of other such languages exist, including Water, MetaL, o:XML, XL, XS, etc.

Reply

LISP in college? azimuth0 March 25 2005, 14:33:55 UTC
I utterly disagree with you that people are exposed to modern lisps in college. Just about every CS graduate I've talked with expressed a distaste for lisp because: 1) it's only interpreted, 2) lists are the only data structure, and 3) it's only about recursion. All of these are demonstrably false, yet the myths persist because of the half-assed way it's presented.

Your "reflexes after years of practice" sounds like code that you should only have to write once. Lisp lets you do that.

Infix vs prefix notation for mathematics is a red herring. Any mathematical expression complicated enough to cause confusion in one notation will still be complicated when converted. In practice, it's a non-issue.

Reply

Re: LISP in college? averros March 30 2005, 01:49:20 UTC
It really does say something (bad) about colleges that teach: 1) that it's interpreted (though the point of a Lisp interpreter written in Lisp is not about interpreters); 2) that it's "only about recursion", when even chapter 1 of SICP explains the difference between recursive /procedures/ and recursive /processes/ vs. iterative /processes/, a difference that, e.g., C blurs all too well, to the point of causing mental handicap and blindness.
Maybe some have difficulty in learning these concepts, but you can't blame Lisp for that...
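The SICP distinction invoked above can be sketched in code (my transposition from SICP's Scheme into Python; the function names are illustrative, not from the book or the thread):

```python
# Sketch of the SICP chapter-1 distinction: both definitions below are
# recursive *procedures* (each calls itself), but they generate different
# *processes* when run.

def fact_recursive(n):
    """Recursive process: each call defers a multiplication, so pending
    work piles up until the base case is reached."""
    return 1 if n == 0 else n * fact_recursive(n - 1)

def fact_iterative(n, acc=1):
    """Iterative process: all state lives in the arguments, and the tail
    call just carries it forward - nothing is deferred. Scheme would run
    this in constant space; CPython still grows the stack here, blurring
    the distinction much as C does."""
    return acc if n == 0 else fact_iterative(n - 1, acc * n)

print(fact_recursive(5), fact_iterative(5))  # 120 120
```

Both compute the same factorial; the difference is in the shape of the computation, which is exactly what the comment says C-trained intuition tends to miss.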

Reply

Write it yourself averros March 26 2005, 06:09:00 UTC
If you have such a problem with prefix math, then write an infix-to-prefix reader macro for Lisp. That's what I did as a little exercise playing with reader macros. It was buggy but worked. I could write the following:

(print {2 + 3 * 4})

and get 14 printed. So get to work and learn something like: infix is a PITA to implement.
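To give a feel for what such a reader has to do (this is my sketch in Python, not the Lisp reader macro the comment describes; the names `tokenize`, `parse`, and `infix` are illustrative), here is a minimal precedence-climbing parser that gets `{2 + 3 * 4}` right:

```python
# A sketch of the work an infix reader must do: tokenize, then
# precedence-climb so that * binds tighter than + and () can override.
import operator
import re

PREC = {'+': 1, '-': 1, '*': 2, '/': 2}
OPFN = {'+': operator.add, '-': operator.sub,
        '*': operator.mul, '/': operator.floordiv}

def tokenize(s):
    """Split an expression into integer and operator/paren tokens."""
    return re.findall(r'\d+|[+\-*/()]', s)

def parse(tokens, min_prec=1):
    """Precedence climbing: parse one operand, then greedily absorb
    operators whose precedence is at least min_prec."""
    tok = tokens.pop(0)
    if tok == '(':
        left = parse(tokens, 1)
        tokens.pop(0)  # consume the matching ')'
    else:
        left = int(tok)
    while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        # +1 makes operators of equal precedence associate to the left
        right = parse(tokens, PREC[op] + 1)
        left = OPFN[op](left, right)
    return left

def infix(s):
    return parse(tokenize(s))

print(infix("2 + 3 * 4"))  # 14
```

Even this toy version needs a tokenizer, a precedence table, an associativity convention, and parenthesis handling - which is the comment's point: a prefix reader needs none of that machinery.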

Reply

Re: Write it yourself averros February 13 2007, 19:00:04 UTC
What kind of response is "write it yourself"? Why not use a programming language that is actually bundled with a non-trivial parser so you don't have to write your own?

Reply

True cleverness stcredzero February 13 2007, 20:32:15 UTC
You are making two assumptions: 1) that this is a foreign culture, and 2) that this is all a matter of psychological hang-ups of the "mainstream" programmers.

Well, 1) is basically an accusation of being uneducated, or simply unable to comprehend the obvious superiority of your preferred approach. You are making very unwarranted assumptions about your opponents. I am a specialist in programming languages (among other things), wrote some compilers and interpreters, and consider myself rather familiar with LISP and its numerous progeny (Planner, Scheme, etc). I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it. In any case, I do not think that this is an argument suitable for a civilized discussion.

Just because you were exposed to something in university doesn't mean you know very much about it. I spent an entire semester doing Smalltalk and C++ as a graduate student, and was cited as the best Smalltalk programmer by the TA. It wasn't until years later that I realized that I knew nothing about doing good Smalltalk, and that my class was basically useless. (I have been doing Smalltalk now for 9 years.)

Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.

Here, you are admitting that it *is* cultural. As a practical matter, it is important. However, this statement doesn't show much self-awareness as a programmer.

Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.

I call into question your "cleverness" then. True cleverness results in greater clarity while reducing the repetition of code.

It sounds like you repeat lots of code.

Reply

