Derisive comments are often made about the syntax of Lisp,
as witness some reproaches on
my previous blog entry.
Thus the half-joking, half-serious backronym of
Lots of (Insipid | Irritating | Infuriating | Idiotic | ...)
and (Spurious | Stubborn | Superfluous | Silly | ...) Parentheses, and accusations that Lisp syntax would make...
You are making two assumptions: 1) that this is a foreign culture, and 2) that this is all a matter of psychological hang-ups on the part of "mainstream" programmers.
Well, 1) is basically an accusation of being uneducated, or simply unable to comprehend the obvious superiority of your preferred approach. You are making very unwarranted assumptions about your opponents. I am a specialist in programming languages (among other things), have written some compilers and interpreters, and consider myself rather familiar with LISP and its numerous progeny (Planner, Scheme, etc.). I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it. In any case, I do not think this is an argument suitable for a civilized discussion.
The point 2) is contradictory: you are saying that people have psychological problems, after insisting that languages should be compared on technical merits. The problem with LISP is not technical. It is psychological. For the vast majority of practical programmers, LISP is decidedly inconvenient. The inability to come up with a truly useful M-syntax is a part of it. Another part is that human cognition does not deal well with recursion or semantics-based recognition. This is a well-established fact, known to any student of psychology. That's why people prefer "flattened", less compositionally powerful environments with rich decorative "syntax" - a textbook example is putting colorful beer keg handles on the control levers of a nuclear plant. It reduces mistakes and speeds up recognition, you see.
The thrill of coming up with a clever solution to a particular algorithmic need (and LISP is great for expressing novel solutions) is quite familiar to any good programmer. However, when I code something deliverable, I, like any seasoned professional, try to avoid doing anything clever, and prefer stereotyped, well-understood moves, made into reflexes by years of practice. That's how I'm able to write good-quality code fast enough to meet the typically impossible timelines. Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.
So, any psychological crutch (like rich syntax) helps. Coming up with good syntax is artistry and psychology far more than CS (which may explain why most languages are so horrible). It is only too easy to get carried away in any direction, from total austerity (LISP) to profligate, noise-like compactness (APL, and to a lesser extent Perl) or the inane verbosity of COBOL. Note that even the purported citadel of syntactic purity was quickly corrupted with not-strictly-necessary things like quotes, square brackets, and, well, quite a few styles of macros :)
Logically, infix notation is definitely inferior to postfix or prefix notation; but it is the conventional notation of mathematics, the one people are trained to use from their mid-school years. Mentally translating from one notation to another is an unnecessary step, better done by software, leaving people with a smaller gap between their mental models and the code they need to write or comprehend. Of course, anyone is free to design a language that ignores that training, and to insist that people should master thinking in reverse Polish notation or some such - but what happens is that most people simply move on to a language more convenient for them. To date, no language insisting on non-infix notation (or ignoring conventional operator precedence rules) has gained more than niche acceptance (PostScript may be an exception, but very few people actually write in it).
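To make that translation step concrete (a toy illustration only, nothing more), compare the discriminant of a quadratic as schools drill it with its prefix rendering:
    ;; Conventional infix, as taught in school:  b^2 - 4*a*c
    ;; The same expression in Lisp's prefix notation:
    (- (* b b) (* 4 a c))
    ;; Note that precedence disappears: (+ 1 (* 2 3)) can only mean 7,
    ;; while 1 + 2 * 3 relies on the reader knowing * binds tighter than +.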
Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.
Reply
Even that only evades the issue. Why have hundreds of years of mathematics evolved a sub-optimal method of expression? In fact, we might well ask why (all? most? many?) natural languages are complex, context-sensitive beasts. Are we merely at the mercy of the very first grammarian, or is there something in our primitive brain which is wired to be more receptive to complex grammars?
Reply
(2) Grammars are not complex, but big. That is, each rule is simple, but there are lots of rules. A massively parallel brain can cope with that - as long as it has to deal with similar stuff.
(3) Programming is different from the usual stuff. Semantics is part of programming, and it's part of what any programmer must master in order to write decent programs. Some languages make it easier to discuss semantics than others.
(4) Lisp doesn't prevent you from learning lots of rules. Actually, in Common Lisp (as opposed to, say, Scheme), there are a tremendous number of rules. But these rules are in the semantics, not in the syntax, so the programmer can focus on what matters rather than be diverted by details.
Reply
2) The human brain is massively parallel. And it has the benefit of some billions of years of evolution in dealing with large, detailed spatial environments - which have no recursion to speak of, are not regular or repetitive, and cannot be combined in any meaningful way. The most important thing is to recognize them by a multitude of visual, aural, olfactory or tactile cues - and so not get lost.
The grammar-capable brain regions are relatively recent, 100k years or so. They're small, too: check Broca's area on a brain anatomy map, for example, and compare it with the size of the visual cortex (occipital areas V1-V4 and the inferotemporal lobe). No natural language has anything like deeply nested structures (interestingly, the most "recursive" language is Vietnamese :)
The simple truth is that people are very poor thinkers when faced with recursive structures. Goedel's 1929 results would have been quite accessible to Aristotle or Muhammad al-Khwarizmi, but it took, well, millennia to come up with what seems, in retrospect, to be a fundamental but rather simple statement about the foundations and meaning of logic.
Most practical programming projects are, in fact, very similar. There's quite a limited set of moves, algorithms or tricks an "average" programmer has in his bag. Those are practically always sufficient to get the job done.
3) Programming is different from the usual stuff. Yes, it is. It requires people to perform unnatural tasks, all day long. That's why it is much harder than, say, driving. But at the end of the day, both professions are about telling machines what to do.
The result is obvious - most people are extremely poor programmers, and even good programmers are apt to make frequent mistakes. A driver with a comparable error rate would get himself killed on his very first trip.
Some languages make it easier to discuss semantics than others.
Programming is not about discussing - it is about translating intent into code. Descriptive power is often counterproductive: replacing mechanically learned movements with deliberation and reasoning is a sure way to sabotage the actual task - namely, getting from "we want this and that to be done" to "computer: do this and that".
Reply
LISP per se isn't a very semantics-rich language. You can write libraries which do a lot of things in it, but you can do that in any other language as well. If we're talking about semantic power, the "best" language is, of course, the Unix shell. It has an entire mature OS backing it. Still, it is only used as glue, since it sucks in other respects. (Funnily enough, at one time I used LISP as a shell, on a PDP-11, just for the heck of it... but in the end I went back to csh - brevity is important in a command-line interface :)
There are other things besides syntax which matter - like restrictions. Protecting the programmer from his own mistakes is an important function of a language, but it necessarily limits his freedom of expression. Or take the inability to write self-modifying code. From a pure expressive-power point of view it is a shortcoming. From the point of view of writing debuggable, robust and reasonably secure software, it is a significant advantage. And the significance of these restrictions grows with the size and complexity of the project - nobody in his right mind would build anything as complicated as an OS, air traffic control, or (yes) loan processing at an average bank in a language which allows the programmer to change module interfaces dynamically.
Cognitively, you don't want a shared terrain to change its features rapidly - that would make any kind of collaboration impossible. Of course, this can be achieved with some degree of self-discipline in any language, but it is much better to have the discipline actually enforced. You can do that in LISP by insulating programmers from the core language, but all you get that way is the same C++/Java/Ada, with more verbose syntax.
Reply
XML was meant to be easy for a machine to parse, not for a human to read. (Humans were supposed to use tools where necessary.) Further, it was meant for data, not code.
Reply
Your "reflexes after years of practice" sounds like code that you should only have to write once. Lisp lets you do that.
Infix vs prefix notation for mathematics is a red herring. Any mathematical expression complicated enough to cause confusion in one notation will still be complicated when converted. In practice, it's a non-issue.
Reply
Maybe some have difficulty in learning these concepts, but you can't blame Lisp for that...
Reply
(print {2 + 3 * 4})
and get 14 printed. So get to work and learn something - like the fact that infix is a pain to implement.
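For the curious, here is roughly what that takes - a toy sketch of my own, not a library. It assumes whitespace-separated tokens, the four binary operators, and ()-grouping; a { reader macro parses the braces into prefix form by precedence climbing:
    ;; Operator precedence for the toy grammar.
    (defparameter *precedence* '((+ . 1) (- . 1) (* . 2) (/ . 2)))

    (defun prec (op)
      (and (symbolp op) (cdr (assoc op *precedence*))))

    ;; Precedence climbing over a flat token list; returns the prefix
    ;; expression and the unconsumed tokens.
    (defun parse-infix (tokens min-prec)
      (let ((lhs (pop tokens)))
        (when (consp lhs)               ; a ( ) group is parsed recursively
          (setf lhs (parse-infix lhs 0)))
        (loop
          (let ((op (first tokens)))
            (when (or (null (prec op)) (< (prec op) min-prec))
              (return (values lhs tokens)))
            (pop tokens)
            (multiple-value-bind (rhs rest)
                (parse-infix tokens (1+ (prec op)))
              (setf lhs (list op lhs rhs)
                    tokens rest))))))

    ;; { reads everything up to the matching } and hands it to the parser.
    (defun infix-reader (stream char)
      (declare (ignore char))
      (values (parse-infix (read-delimited-list #\} stream t) 0)))

    (set-macro-character #\{ #'infix-reader)
    (set-macro-character #\} (get-macro-character #\)))

    ;; (print {2 + 3 * 4})    => 14, since {2 + 3 * 4} reads as (+ 2 (* 3 4))
    ;; (print {(2 + 3) * 4})  => 20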
Reply
I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it.
Just because you were exposed to something in university doesn't mean you know very much about it. I spent an entire semester doing Smalltalk and C++ as a graduate student, and was cited as the best Smalltalk programmer by the TA. It wasn't until years later that I realized I knew nothing about writing good Smalltalk, and that my class had been basically useless. (I have been doing Smalltalk for 9 years now.)
Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.
Here, you are admitting that it *is* cultural. As a practical matter, that is important. However, this statement doesn't show much self-awareness as a programmer.
Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.
Then I call your "cleverness" into question. True cleverness results in greater clarity while reducing repetition of code.
It sounds like you repeat lots of code.
Reply