notes-computer-programming-programmingLanguageDesign-prosAndCons-lisp


landoflisp's 10 reasons why lisp reduces bugs:

http://landoflisp.com/#guilds

notes: mentions dialects Common Lisp, Scheme, Arc, Clojure

---

"What Makes Lisp Different?":

-- http://books.google.com/books?id=QzGuHnDhvZIC

---

http://www.paulgraham.com/diff.html

" 1. Conditionals. A conditional is an if-then-else construct. We take these for granted now. They were invented by McCarthy? in the course of developing Lisp. (Fortran at that time only had a conditional goto, closely based on the branch instruction in the underlying hardware.) McCarthy?, who was on the Algol committee, got conditionals into Algol, whence they spread to most other languages.

2. A function type. In Lisp, functions are first class objects-- they're a data type just like integers, strings, etc, and have a literal representation, can be stored in variables, can be passed as arguments, and so on.

3. Recursion. Recursion existed as a mathematical concept before Lisp of course, but Lisp was the first programming language to support it. (It's arguably implicit in making functions first class objects.)

4. A new concept of variables. In Lisp, all variables are effectively pointers. Values are what have types, not variables, and assigning or binding variables means copying pointers, not what they point to.

5. Garbage-collection.

6. Programs composed of expressions. Lisp programs are trees of expressions, each of which returns a value. (In some Lisps expressions can return multiple values.) This is in contrast to Fortran and most succeeding languages, which distinguish between expressions and statements.

It was natural to have this distinction in Fortran because (not surprisingly in a language where the input format was punched cards) the language was line-oriented. You could not nest statements. And so while you needed expressions for math to work, there was no point in making anything else return a value, because there could not be anything waiting for it.

This limitation went away with the arrival of block-structured languages, but by then it was too late. The distinction between expressions and statements was entrenched. It spread from Fortran into Algol and thence to both their descendants.

When a language is made entirely of expressions, you can compose expressions however you want. You can say either (using Arc syntax)

(if foo (= x 1) (= x 2))

or

(= x (if foo 1 2))

7. A symbol type. Symbols differ from strings in that you can test equality by comparing a pointer.

8. A notation for code using trees of symbols.

9. The whole language always available. There is no real distinction between read-time, compile-time, and runtime. You can compile or run code while reading, read or run code while compiling, and read or compile code at runtime.

Running code at read-time lets users reprogram Lisp's syntax; running code at compile-time is the basis of macros; compiling at runtime is the basis of Lisp's use as an extension language in programs like Emacs; and reading at runtime enables programs to communicate using s-expressions, an idea recently reinvented as XML. "
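
notes: a quick sketch of points 2, 4 and 7 in Common Lisp (my own example, not from the article; any conforming implementation should accept it):

```lisp
;; 2. Functions are first class: store one in a variable, pass it along.
(defvar *op* #'+)                 ; a function object held in a variable
(funcall *op* 1 2 3)              ; => 6
(mapcar #'1+ '(1 2 3))            ; => (2 3 4)

;; 4. Variables are effectively pointers: binding copies the reference,
;; not the value pointed to.
(defvar *a* (list 1 2 3))
(defvar *b* *a*)                  ; *b* points at the same list as *a*
(setf (first *a*) 99)
*b*                               ; => (99 2 3)

;; 7. Symbols compare by pointer identity; strings do not.
(eq 'foo 'foo)                    ; => T
(eq "foo" "foo")                  ; unspecified -- use EQUAL for strings
```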

--- Readable Lisp S-expressions Project http://readable.sourceforge.net/

---

See also pros and cons for specific Lisps, also in this directory ([Self-notes-computer-programming-programmingLanguageDesign-prosAndCons]).


" Old LISPer that I am, I also looked at various current dialects of Lisp and Scheme—but, as is historically usual for Lisp, lots of clever design was rendered almost useless by scanty or nonexistent documentation, incomplete access to POSIX/UNIX facilities, and a small but nevertheless deeply fragmented user community. " -- http://www.linuxjournal.com/article/3882


http://funcall.blogspot.sg/2009/03/not-lisp-again.html

---

" Tac-Tics said...

    The sad reality.
    Tail recursion is a pain in the butt to debug. It turns out while creating stack frames is slow, it's really USEFUL when stepping through a program. If a procedure tail-recurs to another, the stack trace shows a miracle has happened: a procedure is being called from the one right above it, but the one above it never even mentions it. Those kinds of miracles are Bad News.
    On top of that, being forced to create a function inside a function is not as natural. It is recursive, so it needs to be given a name, but the name is always something lame like "iter" or "_fact" or something dumb. In practice for-loop style constructs are more visible and easier to follow.
    First-class functions are very important. However, that particular example isn't a good one. You can do the same in assembly or C, albeit the syntax in C is gimped. The real power of first-class functions in Lisp comes from the ability to close over local variables in the outer scope.
    March 5, 2009 at 12:18 PM "
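
notes: a sketch (mine, not the commenter's) of the inner "iter" pattern being complained about, next to a loop-style version. Worth remembering that ANSI Common Lisp does not even require tail-call elimination; it is an implementation choice:

```lisp
;; The inner-helper pattern: the recursive worker needs a name,
;; and it is always something lame like ITER.
(defun factorial (n)
  (labels ((iter (n acc)
             (if (zerop n)
                 acc
                 (iter (1- n) (* n acc)))))  ; tail call: the frame may be
    (iter n 1)))                             ; reused, confusing stack traces

;; The loop-style equivalent, arguably easier to step through in a debugger:
(defun factorial-loop (n)
  (let ((acc 1))
    (dotimes (i n acc)
      (setf acc (* acc (1+ i))))))
```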

---

tutorial: http://cs.ucla.edu/~rosen/161/notes/lisp1.html

--

" [–]rukubites 7 points 2 years ago

Why don't you agree with the article? Lisp (and here I mean Common Lisp), is full of hacks and inconsistencies. It comes with a framework to arbitrarily generate code using the full power of the language (macros), and goes so far as to allow you to go even deeper and alter how the language itself is parsed through read-macros.

You can statically type things in lisp but it is damn messy, and the typing isn't really meant for correctness, but rather as compiler advice for optimization.

The most used part of the language for me - loop - is a hacked together sublanguage in itself, as is the format system for text output.

There are other things too, such as packages and features and readtables, but that is probably enough for now.

Full disclosure: I am a paid CL programmer.


[–]Squirrel_of_doom 10 points 2 years ago

You start out declaring that CL is full of hacks and inconsistencies, but fail to list any. A Turing complete macro system is not a hack, nor is this method of compile time AST manipulation "inconsistent" or theoretically unsound.

Why is static typing messy? (the type value) and (declare (type my-type y z)) are syntactically lengthy, which is why the compiler has a macro system and extensible parser in the first place. Look at GBBopen's typed numerics system for syntax ideas. For example, (& val) => (the fixnum val) and (+& val-1 val-2) => (the fixnum (+ (the fixnum val-1) (the fixnum val-2))). Or roll your own typed-let.

Your attacks on loop and format are silly. You don't have to use either one, and 99% of the time (loop for i from 1 to 10 do (print i)) or (format t "~A" val) will suffice.

All languages have "first 5 minute annoyances", things that bug you in your first 5 minutes of playing with them. If you really are a [smart] paid CL programmer, you should be able to dig much deeper than that.


[–]rukubites 10 points 2 years ago

I love lisp. It fits my mind like no other programming language ever did. I've programmed it - mainly for pay - for about 10 years.

Macros are awesome and complicated and powerful. However there are numerous traps such as compile-time evaluation versus runtime evaluation, variable capture, correct use of gensyms, etc. Read macros are great but frightening too. #. has a specific special variable to turn it off (*read-eval*)! Once you scratch the surface, they are very inelegant, and what is a hack if not an inelegant solution?

The static typing is messy because it is hard to know what to declare to actually get the performance speedups needed. In practice, you have to keep recompiling a function and guess which declarations would be needed for a given speedup. When I did this, it was particularly hard because I was using an alisp system, but alisp's compiler advice was useless, so I had to do a mini port of that part to sbcl just to optimize.

What I tend to actually use for typing is clos and defmethods. I have encountered the gbbopen numerics. They didn't impress me because you could just do (+& val-1 val-2) and think you've optimized (and be wrong). Someone did that to me once.

None of what I said was actually an attack. I love Common Lisp as much as anyone. I have used the full syntax (including typing) of loop and also used just about every feature of format, including even the ~/…/ directive. (Did you know that you have to specify the package of the function you put in between the slashes?)

    Your attacks on loop and format are silly. You don't have to use either one, and 99% of the time (loop for i from 1 to 10 do (print i)) or (format t "~A" val) will suffice.

I would (dotimes (i 10) (print (1+ i))) or (princ val) every time. loop is great so you don't have to use five levels of let* indentation or the abominable do/do*. Your trivial examples are trivial.

Format is awesome, but I have to look up http://gigamonkeys.com/book/a-few-format-recipes.html or dig in the hyperspec far too often.

I love loop and format, but they are still hacked together and the result of design by committee. I did mention packages and readtables and features as hacked together things. Also - pathnames, shadowing symbols, the inconsistency of CLOS with respect to core types, etc. etc.

I'm reminded of someone else on reddit who remonstrated with me because I said that genetic algorithms were hard. They are conceptually simple (like common lisp), but scratch the surface and try to solve real, difficult problems - you'll find a whole lot of messy, harsh compromises (like what is in the core of common lisp).

Peace. :-)


[–]lvaruzza 1 point 2 years ago

Off the top of my head: the Hash API, the lack of a consistent sequence API (unlike Clojure), and also the object system: you can't create a generic version of the function + for example (even C++ allows that).

Lisp and Haskell also lack a generic stream IO API (Java did it right).

"
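
notes: a minimal sketch (my own, not from the thread) of the variable-capture trap and the gensym fix that rukubites mentions:

```lisp
;; A naive SWAP macro whose expansion captures any caller variable named TMP:
(defmacro bad-swap (a b)
  `(let ((tmp ,a))
     (setf ,a ,b)
     (setf ,b tmp)))

;; (let ((tmp 1) (x 2)) (bad-swap tmp x)) silently swaps nothing:
;; the macro's inner TMP shadows the caller's TMP, so both SETFs
;; hit the wrong binding. GENSYM gives the expansion a fresh,
;; uncapturable name:
(defmacro swap (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b)
       (setf ,b ,tmp))))
```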

--

http://readevalprintlove.fogus.me/sakura/index.html

in addition to being a good read itself,

has links for:

" The Nature of Lisp by Slava Akhmechet What Made Lisp Different by Paul Graham Why Ruby is an Acceptable Lisp by Eric Kidd Lisp is Not an Acceptable Lisp by Steve Yegge Why I Ignore Clojure by Manuel Simoni Why Scala is an Acceptable Lisp by Will Fitzgerald Plotting and Scheming the Ubiquitous LISP by André van Meulebrouck What can Lisp do that Lua can’t?

.. There was an epic thread on the comp.lang.lisp Usenet list circa 2002 involving part trolling, part rage and part wisdom entitled Why Scheme is not a Lisp?. It’s well worth exploring that thread for a deeper understanding of just what constitutes a Lisp and how Internet communications will be the death of us all. ↩ "

-- lisp and common lisp history:

http://www.lispworks.com/documentation/HyperSpec/Body/01_ab.htm

--

light3 1 day ago

>Lisp is a language that was ahead of its time, but there are language features now that seem beyond Lisp's grasp.

Interested to hear your views, can you provide some examples?

reikonomusha 1 day ago

Type systems: This is the biggest issue in my opinion. Most Lisps don't really have any formal notion of a type system. Common Lisp kind of does; it's pretty baroque, but if you look deep enough, you'll see it's way behind the systems offered by ML derivatives, Scala, or Haskell. Such a thing would be incredibly hard to bolt on. Shen sort of offers a richer system in very weird syntax, but the compiler just throws that info away and doesn't make it useful. Typed Racket is another approach.

Polymorphism: In Common Lisp, I can't really make efficient, generic data structures. In Haskell, I can, by making the data structure polymorphic. Haskell will know the types at compile time and can optimize accordingly. In CL, I must do ugly things like provide equality predicates to functions, as opposed to having them associated to the data structure itself. François René Rideau has been trying to patch this up by something called the "Lisp Interface Library".

Functional optimizations: In any Lisp, you typically need a special library for doing optimization of functional code. Deforestation and so on can only be done with special packages like reducers in Clojure or SERIES in Common Lisp. Again, they aren't broad enough to cover the language as a whole.

Immutable/persistent data structures: Clojure has this pretty covered. It is possible to implement these data structures in other Lisps, like Common Lisp, but they're not bound to be very efficient.

OS integration: Not much of a comment. For Common Lisp at least, the language was designed without POSIX or Windows in mind. So it has really weird pathname conventions, poor ways of targeting the user environment, a weird idea about files, etc.

Code organization and packaging at the language level: This is an issue with CL and Scheme. Lisp doesn't really have the concept of an explicit API, or modules of code. There's no concept of a compiled shared library. Code is almost always distributed and integrated by source.

...

The list goes on. You can implement lazy data structures in Lisp, but it's hard to really integrate them in the language. Lazy data structures provide tons of benefits, especially by moving the boundaries for abstraction, but there seems little hope to make this a part of Lisp.

A big problem is that even if some of the above concepts are implemented in various languages (and as I stated, some of them have), they're usually implemented as a part of a toy language (even if it's not intended to be a toy), and are never really integrated well with what exists. Because of this, I don't think it's fair to say Lisp has all of these features, even if there exists dialects of Lisp that implement some of them.
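
notes: a small example (mine) of the "provide equality predicates to functions" style the polymorphism point is about. The container never knows how to compare its own elements; the caller must say so at every call site:

```lisp
;; The :TEST predicate travels with the call, not with the data structure:
(member '(1 2) '((0 1) (1 2)) :test #'equal)   ; => ((1 2))

;; Hash tables pick their equality once, at creation, from a fixed menu:
(make-hash-table :test #'equal)

;; By contrast, in Haskell the element type carries its own Eq instance,
;; so  elem [1,2] [[0,1],[1,2]]  needs no extra argument.
```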


--

" urbit 3 days ago


You can get rid of the whole name reduction system. Which is hardly trivial. If you assume it, though, it's true that everything else is trivial.

Getting symbol tables, functions, environments, free and bound variables, etc, etc, out of the fundamental automaton, frees you up to design them right at the higher layer where they (IMHO) belong.

This philosophical argument has serious practical ramifications, I think, because it leads directly to the Question of Why Lisp Failed. Why did Lisp fail? Many people say, because it couldn't be standardized properly.

Why couldn't it be standardized? Because the Lisp way is not to start with a simple core and build stuff on top of it, but to start with a simple core and grow hair on it. So you end up with a jungle of Lisps that are abstractly related, but not actually compatible in any meaningful sense. This is because the lambda calculus is an idea, not a layer.

Basically the point of Nock is to say: let's do axiomatic computing such that it's actually a layer in the OS sense. The way the JVM is a layer, but a lot simpler. Lambda isn't a layer in this sense, so it doesn't provide the useful abstraction control that a layer provides. "

--

" dlweinreb 2141 days ago


In all fairness, the manuals that filled a whole shelf documented a lot of major applications. On my desk right now, I have a copy of the O'Reilly book on Subversion (a source control system). I have another book on Emacs. And so on. ALL of those things were covered in that shelf.

Regarding simplicity versus complexity, please see http://www.joelonsoftware.com/items/2006/12/09.html. Different people want different things; you can't just provide the common 20%.

Over the last few days, I have been surveying the WWW for criticisms of Common Lisp. The two that I see most often are: (1) it's too big, and (2) it's missing so many important features like threads, sockets, database connectivity, operating system interoperability, Unicode, and so on. Ironic, no?

It is really too bad that Common Lisp was not defined as a language core, plus libraries. We did originally intend to do that (they would have been called the "White Pages" and "Yellow Pages"), but we were under too much time pressure.

There is no question that Common Lisp is a lot less elegant that it could have been, had it been designed from scratch. Instead, it had two major design constraints: (1) it had to be back-compatible with MacLisp and Zetalisp in order to accommodate the large body of existing software, such as Macsyma, and (2) it had to merge several post-MacLisp dialects, in a diplomatic process (run magnificently by Guy L. Steele Jr) that made everyone reasonably satisfied. It was quite literally a design by committee, and the results were exactly what you'd expect.

But the imperative was to get all the post-MacLisp implementations to conform to a standard. If we failed, DARPA would have picked InterLisp as the reigning Lisp dialect, and we would have all been in a great deal of trouble. (Look where InterLisp is today; actually there's nowhere to look.)

You wonder how other people learned to use Symbolics machines. Some of them took courses - we had an extensive education department. Before you say "that proves that it was too complicated", keep in mind that the system was very large and functional because that's what its primary target market wanted. We did not get feedback from customers saying "make it simpler"; we got feedback saying "add more features, as follows". I bet the people who maintain the Java libraries are rarely asked to remove large amounts of the libraries.

I'm not sure what the reference to Steve Jobs is about. Look at how many features the Macintosh has now. It takes a long time to learn all of them. Their documentation is much shorter because they don't give you any; you have to go to the book store and buy David Pogue's "The Missing Manual" books.

I admit that some (not most) of the complexity was gratuitous and baroque, but not because we liked it that way. The complexity (mainly the non-uniformity) of Common Lisp was beyond our control (e.g. the fact that you can't call methods on an array or a symbol, and so on). Some of the subsystems were too complex (the "namespace system", our distributed network resource naming facility) comes to mind.

In summary, I'm sympathetic to what you're saying, but the reasons for the problems were more involved.

-- Dan Weinreb "

"

dlweinreb 2094 days ago

It's true that the system was feature-laden. I think this was more true of the API's than the user interfaces, though, and so I'm not sure that the Steve Jobs reference is exactly appropriate. Steve Jobs is making consumer products; most customers don't care much about the API's.

It was also featureful because we didn't know which features were the ones that would turn out to be most useful; if there had been a second generation, we could have pruned out some of the stuff that never really got used. It was something of a "laboratory" that way.

Also, the kind of people who used Lisp machines, generally early adopter types, really did ask for amazing numbers of features. If you had been there, you would have experienced this. We wanted to make all our users happy by accommodating all their requests. It's probably similar to the reason that Microsoft Word has so many features. Everyone thinks there are too many and has a long list of the ones they'd get rid of; but everyone has a different list! I think Joel Spolsky wrote something very convincing about this topic once but I can't remember where.

Lucid on Suns was eventually as fast, if you turned off a lot of runtime checking and put in a lot of declarations. Later it was even fast if you didn't do that; the computational ecosystem changed a whole lot since the Lisp machine was originally designed. You have to remember how old it was. At the time it came out, it was very novel to even suggest that every AI researcher have his or her very own computer, rather than timesharing! That's early in the history of computers, by today's standards.

No, we didn't teach all of our customers personally, although we did have an education department that taught courses, and some of them learned that way. There were classes in Cambridge and in San Francisco. Allan Wechsler designed the curriculum, and he's one of the best educators I have ever met. (My own younger brother worked as a Symbolics teacher for a while.)

Common Lisp is complicated because (a) it had to be upward-compatible with very, very old stuff from Maclisp, and (b) it was inherently (by the very nature of what made it "Common") a design-by-committee. For example, consider how late in the lifetime of the language that object-oriented programming was introduced. (Sequences and I/O streams should obviously be objects, but it was too late for that. CLOS wasn't even in the original CLtL standard.)

In other words, I'm mainly not disagreeing with your points, just explaining how things got that way. "

--

" ...

Common Lisp is the combined effort of 8 different Lisp implementation groups* aimed at producing a common dialect of Lisp while allowing each group to exploit its own hardware. Common Lisp is a set of documents, a language design, and a common body of code.

[* These groups are: Spice Lisp at CMU, DEC Common Lisp on Vax at CMU, DEC Common Lisp on DEC-20 at Rutgers, S-1 Lisp at LLNL, Symbolics Common Lisp, LMI Common Lisp, Portable Standard Lisp at Utah, and Vax NIL.]

The Common Lisp documentation is divided into four parts, known as the white pages, the yellow pages, the red pages, and the blue pages.

The white pages is a language specification rather than an implementation specification. It defines a set of standard language concepts and constructs that may be used for communication of data structures and algorithms in the Common Lisp dialect. This is sometimes referred to as the ``core'' Common Lisp language, because it contains conceptually necessary or important features. It is not necessarily implementationally minimal. While some features could be defined in terms of others by writing Lisp code (and indeed may be implemented that way), it was felt that these features should be conceptually primitive so that there might be agreement among all users as to their usage. (For example, bignums and rational numbers could be implemented as Lisp code given operations on fixnums. However, it is important to the conceptual integrity of the language that they be regarded by the user as primitive, and they are useful enough to warrant a standard definition.)

The yellow pages is a program library document, containing documentation for assorted and relatively independent packages of code. While the white pages are to be relatively stable, the yellow pages are extensible; new programs of sufficient usefulness and quality will routinely be added from time to time. The primary advantage of the division into white and yellow pages is this relative stability; a package written solely in the white-pages language should not break if changes are made to the yellow-pages library.

The red pages is implementation-dependent documentation; there will be one set for each implementation. Here are specified such implementation-dependent parameters as word size, maximum array size, sizes of floating-point exponents and fractions, and so on, as well as implementation-dependent functions such as input/output primitives.

The blue pages constitutes an implementation guide in the spirit of the Interlisp virtual machine specification. It specifies a subset of the white pages that an implementor must construct, and indicates a quantity of Lisp code written in that subset that implements the remainder of the white pages. In principle there could be more than one set of blue pages, each with a companion file of Lisp code. (For example, one might assume IF to be primitive and define COND as a macro in terms of IF, while another might do it the other way around.)

At present the white pages portion of Common Lisp is nearly complete, that document being edited by Guy Steele Jr. at CMU. Since Guy Steele is taking a leave-of-absence from CMU to work at Tartan Labs, and since Scott Fahlman, the head of the Spice Lisp project and a major contributor to Common Lisp, wants to return to his AI research, the administrative control of the Common Lisp effort is in question with several important parts left undone. Stanford proposes to complete those parts.

In particular we propose to do three things. .... "

--

"

    The white pages is a language specification rather than an implementation specification. It defines a set of standard language concepts and constructs that may be used for communication of data structures and algorithms in the Common Lisp dialect. [...]
    The yellow pages is a program library document, containing documentation for assorted and relatively independent packages of code. While the white pages are to be relatively stable, the yellow pages are extensible; new programs of sufficient usefulness and quality will routinely be added from time to time. The primary advantage of the division into white and yellow pages is this relative stability; a package written solely in the white-pages language should not break if changes are made to the yellow-pages library.
    The red pages is implementation-dependent documentation; there will be one set for each implementation. Here are specified such implementation-dependent parameters as word size, maximum array size, sizes of floating-point exponents and fractions, and so on, as well as implementation-dependent functions such as input/output primitives.
    The blue pages constitutes an implementation guide in the spirit of the Interlisp virtual machine specification. It specifies a subset of the white pages that an implementor must construct, and indicates a quantity of Lisp code written in that subset that implements the remainder of the white pages. In principle there could be more than one set of blue pages, each with a companion file of Lisp code. (For example, one might assume IF to be primitive and define COND as a macro in terms of IF, while another might do it the other way around.)
    [...]
    [W]e will produce the first version of the blue pages. This requires producing a detailed specification of the subset of the white pages that must be written, expanding on the white pages description where necessary. We will also write, test, and document an implementation of Common Lisp in that subset and make that code available to anyone wanting to implement a Common Lisp. Thus, for any group to implement a Common Lisp, all that will need to be done is to write the specified subset language in whatever other language their hardware supports and to then take a copy of the Lisp code we will have produced which will complete the implementation of the white pages language. "

--

" AK Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over. " -- http://queue.acm.org/detail.cfm?id=1039523

--

"We did not consider LISP or Scheme because of their unfriendly syntax"

--