LISP 1.5 (ID:31/lis006)

LISP major release 


for LISt Processing.

The lingua franca of artificial intelligence  

LISP 1.5 was the classic LISP: the most widely used release, and the source, directly or through inspiration, of all the other LISPy systems.

NB on "non-descendants":
LISP 2 was only partially an evolution from LISP 1.5 (via THE NEW LANGUAGE), and was a parallel development to LISPs 1.6 -> 1.75 -> 1.9.
M-LISP: LISP 1.5 was the S-expression LISP; an M-expression LISP was always intended, and was a parallel development.




Related languages
LISP => LISP 1.5   Evolution of
LISP 1.5 => *LISP   Extension of
LISP 1.5 => 2-LISP   Extension of
LISP 1.5 => ABLE   Augmentation of
LISP 1.5 => A-language   Extension of
LISP 1.5 => ALSP   Implementation
LISP 1.5 => AML/X   Positive Strong Influence
LISP 1.5 => ATLAS LISP   Implementation
LISP 1.5 => Autolisp   Implementation
LISP 1.5 => Avalon/Common LISP   Extension of
LISP 1.5 => BaLinda Lisp   Extension of
LISP 1.5 => BALM   Influence
LISP 1.5 => Basic PDP-1 Lisp   Implementation
LISP 1.5 => BCL   Incorporated some features of
LISP 1.5 => BLISP   Implementation
LISP 1.5 => BRAND X   Extension of
LISP 1.5 => CBCL   Dialect of
LISP 1.5 => CHARYBDIS   Written using
LISP 1.5 => CLAM   Written using
LISP 1.5 => CONVERT   Extension of
LISP 1.5 => COWSEL   Negative moderate Influence
LISP 1.5 => Dectab   Written using
LISP 1.5 => Deftab   Written using
LISP 1.5 => DELISA   Extension to
LISP 1.5 => EL1   Influence
LISP 1.5 => ELP   Influence
LISP 1.5 => FLAP   Written using
LISP 1.5 => FLIP   Extension of
LISP 1.5 => Gedanken   Influence
LISP 1.5 => GENTRAN   Written using
LISP 1.5 => Gist   Extension of
LISP 1.5 => GM   Abstract Machine for
LISP 1.5 => GOL   Extension of
LISP 1.5 => GRASPE   Extension of
LISP 1.5 => Hart   Extension of
LISP 1.5 => JOSS II   Influence
LISP 1.5 => LEM   Influence
LISP 1.5 => Linear lisp   Extension of
LISP 1.5 => LISP 1.6   Evolution of
LISP 1.5 => LISP O2   Evolution of
LISP 1.5 => Lispkit Lisp   Stricter version of
LISP 1.5 => LITHP   Influence
LISP 1.5 => LLOGO   Written using
LISP 1.5 => LOGLISP   Extension of
LISP 1.5 => LOGLisp   Extension of
LISP 1.5 => LOGO   Influence
LISP 1.5 => LOGO   Written using
LISP 1.5 => LOOPS   Extension of
LISP 1.5 => LPL   Derivation of
LISP 1.5 => MACLISP   Evolution of
LISP 1.5 => Macrosal   Extension of
LISP 1.5 => MACSYMA   Written using
LISP 1.5 => MADCAP VI   Influence
LISP 1.5 => MagmaLISP   Extension of
LISP 1.5 => MDL   Extension of
LISP 1.5 => METEOR   Written using
LISP 1.5 => MIX   Augmentation of
LISP 1.5 => MLISP   Evolution of
LISP 1.5 => MLISP   Co-development
LISP 1.5 => More array theory   Incorporated features of
LISP 1.5 => MuSimp   Simplification of
LISP 1.5 => NPL   Strong Influence
LISP 1.5 => Ontic   Based on
LISP 1.5 => PICTUREBALM   Extension of
LISP 1.5 => PIVOT   Based on
LISP 1.5 => PLANNER   Dialect of
LISP 1.5 => Poplar   Derivation of
LISP 1.5 => POPLOG   Implementation
LISP 1.5 => QA3   Augmentation of
LISP 1.5 => Qlambda   Extension of
LISP 1.5 => QLOG   Extension to
LISP 1.5 => REDUCE   Written using
LISP 1.5 => RLISP   Evolution of
LISP 1.5 => SIGMA   Influence
LISP 1.5 => Spice Lisp   Evolution of
LISP 1.5 => Standard Lisp   Subset
LISP 1.5 => The New Language   Evolution of
LISP 1.5 => TLISP   Implementation
LISP 1.5 => Trilogy   Incorporated some features of
LISP 1.5 => Waterloo LISP   Implementation
LISP 1.5 => XLISP   Extension of

References:
  • McCarthy, J. "Recursive functions of symbolic expressions and their computation by machine, Part I"
          in [ACM] CACM 3(04) April 1960
  • McCarthy, J. "A basis for a mathematical theory of computation" pp225-238
          in [JCC 19] Proceedings of the Western Joint Computer Conference, May 1961
  • Teager, Herbert M. - review of McCarthy (1960) Extract: Review
    MCCARTHY, JOHN. Recursive functions of symbolic expressions and their computation by machine, part I. Comm. ACM 3, 4 (April 1960), 184-195.

    In this paper, the author introduces a mathematical formalism for computing recursive functions; a list processing language (LISP) for computing partial functions of Symbolic Expressions; and finally, the realization of this language as a programming system for the 704 computer.

    The mathematical formalism is built around conditional expressions of the form (p1 -> e1, p2 -> e2, ..., pn -> en), where the pj's are predicates having the values Truth or Falsity, or else are undefined. The value of the conditional expression is the expression ej, if pj is true and all pk, k < j, are false. If no such pj exists, the conditional is undefined.
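    [To make the conditional notation concrete, here is a minimal sketch in LISP 1.5 style - an illustration added to this entry, not text from the review. The absolute-value function, written in M-notation as absval[x] = [lessp[x;0] -> minus[x]; T -> x], becomes a COND form (the name ABSVAL is ours):]

        ;; absolute value via a conditional expression (LISP 1.5 sketch)
        DEFINE ((
          (ABSVAL (LAMBDA (X)
            (COND ((LESSP X 0) (MINUS X))
                  (T X)) ))
        ))
        ;; (ABSVAL -3) tests the predicates left to right and yields 3;
        ;; if no predicate were true, the COND would be undefined.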

    After clearly distinguishing between forms and functions, the author then introduces Church's lambda notation, by means of which functions can be expressed in terms of the basic form and ordered lists of the variables and their arguments. Thus, the form x^2 - y^3/z would be expressed as λ((x, y, z), x^2 - y^3/z) and, if paired with an ordered list of arguments (n1, n2, n3), would produce an unambiguous value for the function. Such lambda expressions can then be defined recursively by conditional expressions through the use of a second notational convention which allows the function to use itself as an argument.
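    [As an added illustration (ours, not the review's), the lambda expression for x^2 - y^3/z, paired with the argument list (2, 3, 3), can be written and evaluated directly in LISP 1.5 S-expression form:]

        ;; λ((x, y, z), x^2 - y^3/z) applied to the arguments (2, 3, 3)
        ((LAMBDA (X Y Z)
           (DIFFERENCE (EXPT X 2) (QUOTIENT (EXPT Y 3) Z)))
         2 3 3)
        ;; value: 2^2 - 3^3/3 = 4 - 9 = -5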

    The types of lists dealt with in the system are called S-Expressions, and are in the form of an ordered pair. Each element of the pair may be either another S-Expression or else any distinguishable symbol. Such a symbol, which may be a collection of characters, is called atomic to distinguish it from an S-Expression. The LISP language has as its basic set of operations two predicates and three basic "S-Functions" which, when used in conjunction with conditional expressions and recursive definition, allow the building of a very powerful set of operations. The two predicates atom (x) and eq (x, y) give the value of Truth depending upon whether the list is composed of a single atomic symbol, or the two atomic symbols x and y are equal. The three basic "S-Functions" are "car (x)", which has for its value the first element of the S-Expression x; "cdr (x)", which has for its value the other half of the S-Expression; and "cons (x, y)", which forms a new S-Expression comprised of the two arguments. (Car and cdr are undefined for the case of an atomic argument.)
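    [The five primitives can be shown in a few lines - a sketch added to this entry; in the 1.5 system the truth value prints as *T*:]

        (CAR  (QUOTE (A B C)))       ;; => A, the first element
        (CDR  (QUOTE (A B C)))       ;; => (B C), the rest of the list
        (CONS (QUOTE A) (QUOTE B))   ;; => (A . B), a new ordered pair
        (ATOM (QUOTE A))             ;; => true: A is a single atomic symbol
        (EQ   (QUOTE A) (QUOTE B))   ;; => false: the two atoms differ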

    The author introduces a number of specialized S-Functions and predicates of S-Expressions which are defined recursively from the elementary ones. A special notation and set of conventions are then defined in order to describe S-Functions themselves as S-Expressions using the normal, rather limited character set available on most existing computers. The LISP programming system as it existed on the 704 is then described in moderate detail, covering topics such as the actual representation of S-Expressions as list structures in the computer, how S-Functions were represented by programs, and the mechanics of pushdown lists for recursive computation.
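    [A representative specialized S-Function, defined recursively from the elementary ones - a sketch added to this entry, named APPEND1 to avoid the system's own APPEND:]

        DEFINE ((
          (APPEND1 (LAMBDA (X Y)
            (COND ((NULL X) Y)
                  (T (CONS (CAR X) (APPEND1 (CDR X) Y))) )))
        ))
        ;; (APPEND1 (QUOTE (A B)) (QUOTE (C D))) => (A B C D)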

    The author then generalizes S-Expressions (which are two-element lists) to allow L-Expressions composed of many-element lists. The corresponding language, called "Linear LISP", is then introduced, but not fully explored. The final discussion in the paper pertains to the parallel which can exist between recursive functions and a program flow chart.

    It is far easier to criticize than to create, and nothing that the reviewer says should be allowed to detract from the fact that this system represents a pioneering attempt to slide a reasonable mathematical foundation under a rather slippery subject, as well as the fact that the programming system described exists, and has been used with marked success for many symbol processing problems. (These are, however, not described in this paper.) As a presentation, this paper puts a needless burden on the reader by a rather loose introduction and frequent change of notational conventions, symbols, and definitions, many of which seem to be introduced rather casually as they are needed.

    To come to questions of more substance, "undefined" expressions and the corresponding recursions which might never terminate were inadequately treated, and might well be vital in any real usage of the system. The reviewer feels that several rather important issues pertaining to machine capacity and speed have been left unanswered by this as well as other similar papers. Undoubtedly, special languages are needed for manipulating lists. There appear to be at least two tacitly acknowledged criteria for such languages: First, all symbols, with the exception of punctuation marks, should be treated as if they were created equal. That is, for full generality (or perhaps democracy) no concept of number or measure should be allowed to creep into the formal system. Second, one should somehow start off with the bare minimum of primitive operations from which all others can be (and presumably are) built up. While it would appear that this leads to a more elegant mathematical formulation of the system, there is a serious reservation that it leads to difficult notation problems for the user as well as an inefficient computing scheme for the machine. Could it be that we are in a sense reducing the machine to our own level? Some functions undoubtedly should be defined recursively, and some lists should be scanned by starting at the first element and working through it essentially by brute force, but if a recursive operation is always used seriously as a computational procedure some rather startling inefficiencies can be guaranteed. For example, addition or multiplication by recursion, or for that matter, sorting a list into order, are rather horrible to contemplate.
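    [The reviewer's point about arithmetic by recursion can be made concrete. Assuming the LISP 1.5 primitives ADD1, SUB1, and ZEROP, here is a sketch (ours; the name RPLUS is hypothetical) of addition defined recursively:]

        DEFINE ((
          (RPLUS (LAMBDA (M N)
            (COND ((ZEROP N) M)
                  (T (RPLUS (ADD1 M) (SUB1 N))) )))
        ))
        ;; (RPLUS 2 3) => 5, but only after three recursive calls and three
        ;; pushdown-list frames: time proportional to N, where the 704's
        ;; adder does the same job in one machine instruction.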

    It would appear that before any language, such as LISP, can lay claim to be a basis for a theory of computation, it must be armed with tools that would enable it to determine its own efficiency as a means to achieve a given end, vis-a-vis other methods.

    Herbert M. Teager, Cambridge, Mass.

          in ACM Computing Reviews 2(01) January-February 1961
  • Yasaki, Edward K. "Computing at Stanford: Scholars & Throughput" Extract: Cutting Edge Computing at Stanford
    "The computation center also has the responsibility of supplementing the classroom instruction of the CSD - the educational role ? by teaching programming and making available machine time," Forsythe said. The CSD last spring offered 18 hours of instruction, including what has been called "the second most popular course on campus": Use of Automatic Digital Computers (three credit hours). Other courses are Computer Programming for Engineers, Numerical (and Advanced Numerical) Analysis, Intermediate (and Advanced) Computer Programming, and such seminar-type courses as Computer Lab, Advanced Reading and Research, and Computer Science Seminar. Selected Topics in Computer Science this fall will cover computing with symbolic expressions and the LISP language ... in the winter, a mathematical theory of computation . . . and spring, artificial intelligence.
    Last year, 876 students were enrolled in CSD courses, of whom more than 50 per cent were graduate students. (Total enrollment at Stanford is approximately 10,000, of whom 5,000 are graduate students.) During the current school year, 600-700 are expected in introductory programming courses, and 70 in the master's degree program. A PhD in numerical analysis is also offered, five graduates having been produced so far, and work has begun on a similar program in computer sciences.
    The majority of programs, and almost all students' work, has been written in BALGOL (Burroughs Algebraic Language), for which the 7090 has a translator written at Stanford and based on a previous translator for the 220. In timing runs, BALGOL was found to compile three to 10 times faster than FORTRAN, although it is five to 10 per cent slower on the object program. In a test case, a problem of 150 equations (150 unknowns) had a compile-and-go time of 2.1 minutes in FORTRAN, and 2.5 minutes in BALGOL; the latter time was reduced to 1.6 minutes by coding the inner loop in machine language. The same problem in B5000 ALGOL ran 3.2 minutes, a respectable figure considering that the 90's memory is almost three times faster than the 5000's. In another test, a 7090 FORTRAN program with a run time of two minutes was translated and run on the 5000 in four minutes.
    In a quite different test, involving the sorting of the columns of a two-dimensional array, the following times were found:

                     run time     total time
    B5000 ALGOL      405 sec.     about 425 sec.
    7090 FORTRAN      47 sec.     72 sec.
    7090 BALGOL       71 sec.     84 sec.
    Student utilization of the 90 during the last school year, however, was only 22 per cent; sponsored and unsponsored research projects took up the remainder - 52 and 28 per cent, respectively. With the installation of the 5000, the campus language will be switched to ALGOL. "The new machine will be used for the job-shop work," Forsythe says, "for which it is ideal. That will make the 90 available for experiments in time-sharing and other research."
    This, then, is the third area of activity of a university computing center - research which will increase man's understanding of the computing sciences. Under Forsythe's direction, Stanford computer activities have moved from under its heavy numerical analysis orientation toward more immediate or popular research - time-sharing and list processing: The recruitment of John McCarthy from MIT and the recent courtesy appointment of GE's Joe Weizenbaum as research associate signify the development of a CSD faculty resembling Forsythe's projections.
    McCarthy was one of four who delivered a time-sharing paper at the last Spring Joint Computer Conference. The system reported on then includes five operator stations on-line with an 8K PDP-1. The PDP is connected to a drum memory with which it swaps 4K words as each user's main frame time comes up. Thus, as far as the user is concerned, he faces a 4K machine. Stanford's PDP is being replaced in February with a 20K model with 12 stations, each with a keyboard and display console. Early next year, McCarthy says, the system should be on the air, and experimentation undertaken in preparation for its use with the 7090 and in the computer-based learning and teaching lab. To be established under a one-megabuck Carnegie grant, the lab's physical plant will be an extension of Pine Hall.
    Says Forsythe, "If you can see two or three years ahead, you're doing well in this field. But we're all looking toward the wide availability of computers to people. Our long-range goal is to continue development of the computer as an extension of the human mind - and enable almost anybody with a problem to have access to computers, just like the availability of telephone service."
    McCarthy is also working on the Advice Taker system in his three-year ARPA-funded research in artificial intelligence. Designed to facilitate the writing of heuristic programs, the Advice Taker will accept declarative sentences describing a situation, its goals, and the laws of the situation, and deduce from it a sentence telling what to do.
    "Out of optimism and sheer necessity of batch processing, we dream that computers can replace thinkers," says Forsythe. "This has not yet been realized, but there have been valuable by-products from this work." Forsythe and Prof. John G. Herriot have grants from the Office of Naval Research and the National Science Foundation, respectively, for work in numerical analysis.
    Forsythe, associate director R. Wade Cole, and Herriot are all numerical analysts. Herriot returned last summer from a year's sabbatical which he spent teaching numerical analysis under a Fulbright grant at the Univ. of Grenoble in France, and completing his recently-published book, "Methods of Mathematical Analysis and Computation" (Wiley).
    "When we get the new PDP, I hope to improve my chess and Kalah-playing programs," the heavily-bearded, bespectacled McCarthy says. His chess program can use considerable  improvement,   he   adds,   and   the  enlarged PDP will give him this opportunity.
    The most popular program, however, is the PDP's "space war," written by the programmer Steve Russell, who worked under McCarthy at MIT and was brought to Stanford from Harvard. Aptly described as the ideal gift for "the man who has everything," space war is a two-dimensional, dynamic portrayal of armed craft displayed on the console scope. Each player utilizes four Test Word switches on the console to control the speed and direction of his craft, the firing of "bullets" and, with the fourth, the erasure of his craft from the CRT (to avoid being hit - a dastardly way out). With each generation of images on the scope, each player's limited supply of "fuel" and "bullets" is pre-set, and a new image is automatically generated after a player scores a hit. Much to the consternation of the CSD staff, the game is a hit with everyone; by executive fiat, its use has been restricted to non-business hours.
    The young and jovial Russell, whose character and demeanor fit the nature of space war, is presently working on LISP 2.
          in Datamation 7(12) Dec 1961
  • McCarthy, J.; Abrahams, Paul W.; Edwards, Daniel J.; Hart, Timothy P.; and Levin, Michael I. "LISP 1.5 Programmer's Manual", MIT Computation Center and Research Laboratory of Electronics, 1962.
  • Bobrow, Daniel G.; and Raphael, Bertram. "A comparison of list-processing computer languages" RAND Corp., Santa Monica, Calif., RM-3842-PR, Oct. 1963
  • Lombardi, L. A. review of McCarthy 1962 Abstract: According to the author, the main goals of a theory of computation would be: 1) The development of a universal programming language 2) The definition of a theory of equivalence of computation processes 3) The representation of algorithms in a way such that significant changes correspond to simple changes of the representation.

    In this paper the first goal is tackled successfully by means of a language structure developed on the basis of LISP. The conditional expression invented by the author is one of the main elements of such success. The representation of functions and functionals according to this philosophy is very effective. The representation of spaces, on the contrary, is not yet fully satisfactory, mainly because of a difficult interface with the representation of functionals: this aspect needs further analysis. The second goal (a very hard one) is approached by giving a method of proving equivalence by recursion induction; i.e., showing that the terms of comparison are the solution of the same recurrence relation.

    As far as the third goal is concerned, this philosophy provides a basis for a solution in terms of what nowadays are called declarative or nonprocedural representations, as opposed to imperative, statement-type or procedural programming languages. Today this is the area to which the most significant modern research in computer and language design is devoted.

    This is one of the most stimulating papers ever written in the area of automatic computation. Its contents can be regarded as a substantive exposure of the practical futility and whimsiness of tackling such problems without imagination and without being willing to pay the cost of a deep scientific insight into their very structure, as was done in the case of conventional and shallow approaches such as the ALGOL and COBOL philosophies.
          in ACM Computing Reviews 4(05) September-October 1963
  • Minsky, M.L. "A LISP Garbage Collector Algorithm Using Serial Secondary Storage" AIM-58, December 27, 1963 Abstract: This paper presents an algorithm for reclaiming unused free storage memory cells in LISP. It depends on availability of a fast secondary storage device, or a large block of available temporary storage. For this price, we get: 1.) Packing of free-storage into a solidly packed block. 2.) Smooth packing of arbitrary linear blocks and arrays. 3.) The collector will handle arbitrarily complex re-entrant list structure with no introduction of spurious copies. 4.) The algorithm is quite efficient; the marking pass visits words at most twice and usually once, and the loading pass is linear. 5.) The system is easily modified to allow for increase in size of already fixed consecutive blocks, provided one can afford to initiate a collection pass or use a modified array while waiting for such a pass to occur.
  • Timothy P. Hart "MACRO Definitions for LISP" October 1963 AIM-057 view details Abstract: In LISP 1.5 special forms are used for three logically separate purposes: a) to reach the alist, b) to allow functions to have an indefinite number of arguments, and c) to keep arguments from being evaluated. New LISP interpreters can easily satisfy need (a) by making the alist a SPECIAL-type or APVAL-type entity. Uses (b) and (c) can be replaced by incorporating a MACRO instruction expander in define. I am proposing such an expander.
  • Barbieri, R. "Computer List-Processing Languages" IBM, Data Systems Division, Poughkeepsie, New York, Technical Report No. TR00.1209, November 1964 Extract: Sammet summary
    This report describes several list structures and some important aspects of their implementation. It describes and compares FLPL, IPL-V, LISP and SLIP. It also gives examples of the use of SLIP in manipulating power series.
  • Reynolds, John C. review of McCarthy, J. et al, 1962 Abstract: This manual represents the first adequate documentation of LISP 1.5, a programming system for list processing and symbol manipulation. In addition to its historical and theoretical importance, LISP is, in the reviewer's opinion, the most powerful and practically useful list processing system currently available. The specific system described in the manual is implemented on the IBM 7090.

    The theoretical importance of LISP stems from its relation to McCarthy's Theory of Computation [ McCarthy, J. "A basis for a mathematical theory of computation," Computer Programming and Formal Systems, Braffort, P. and Hirschberg, D., Eds. North-Holland Publishing Co., Amsterdam, 1963, pp. 33-70. ] The major achievement of this theory is the development of an algorithmic notation which is simple enough to allow the construction of proofs about algorithms, yet powerful enough to allow complex algorithms to be expressed easily. In essence LISP is the concrete realization of this notation as a programming system, and as such it represents a demonstration of the power and completeness of the notation. However the effectiveness of this demonstration is qualified by the existence in LISP of several facilities (particularly pseudo-functions and the program feature) which violate the theoretical notation but appear to be necessary for easy and efficient programming.

    The fundamental construct in LISP is the S-language, which is a concise external representation for list structures. Given a set of atomic symbols, which are alphanumeric strings, an S-expression is defined as either an atomic symbol or a string of the form (s1 . s2), where s1 and s2 are also S-expressions. Later an additional type of S-expression with the form (s1, s2, ..., sn) is introduced as an abbreviation for (s1 . (s2 . ... (sn . NIL) ... )); this type of expression corresponds to a conventional linked list.

    A separate language called the M-language is developed for describing list-processing programs. Beginning with a small set of primitive functions, the set of possible M-expressions is developed by introducing the nested composition of functions, conditional expressions, recursion, and an unusually flexible notation for describing variable binding. The most notable characteristic of the M-language is its emphasis on the use of recursion rather than iteration, and the use of deeply nested functional expressions rather than sequences of imperative statements.

    Finally a set of straightforward mapping rules is given for translating the M-language into the S-language. It should be emphasized that the program which is presented to the computer is not an M-expression, but rather its translation into an S-expression. Thus the system recaptures at the level of the programming language the essential feature of von Neumann-type machine languages, that the program language is a subset of the data language. Although this feature is characteristic of several recent programming systems, LISP is original in using the hierarchical nature of its data structures to reflect the nesting of functional expressions in its programs. Practically, this feature provides an elegant and unified method for handling functions which accept other functions as their arguments, or even produce other functions as their results. In addition the ability to describe the LISP interpreter in its own language (in two pages) provides an unusually rigorous documentation.

    Overall, LISP gives a unique demonstration that profound and careful theoretical considerations can lead to major practical advances in the design of a programming system for mathematically sophisticated users. The profoundness of the language has been evident from its earliest versions, but the magnitude of its practical value is fully apparent only in the more developed 1.5 system.

    However, despite its power and flexibility, LISP 1.5 has several significant limitations which reflect some serious theoretical problems in automatic programming and list processing:

    1 ) The correspondence between S-expressions and list structures is imperfect. Thus for example, the use of S-expressions confuses the distinction between two pointers to the same list and two pointers to equivalent lists. Similarly it is difficult to use circular or re-entrant list structures, which lead to infinite S-expressions. Although such structures are not prohibited in LISP, they require cumbersome programming and possess no external representation. An appropriate extension of the data representation, plus a flexible facility for iterating over re-entrant structures would be a major improvement.

    2) Despite their practical value, pseudo-functions (functions with side effects) and the program feature (which allows programs to be defined by statement sequences) appear to be afterthoughts which were grafted onto a conceptually complete system. The basic structure of the M-language suffers from an inadequate unification of the concept of a program composed of sequential steps with the concept of a program composed of nested expressions.

    3) Although they offer a substantial improvement in conciseness over most other notations for list structures, S-expressions still tend to become unwieldy and difficult to read. In particular, the translation of M-expressions into S-expressions is tedious and error-prone, so that the ability to read in programs directly as M-expressions would be a significant advantage. A more general problem arises from the difficulty of preparing and reading data in the form of S-expressions. Thus for example in a program for performing algebraic manipulation, the expression ax^2 + bx + c might be represented by the S-expression (PLUS, (TIMES, A, (EXPT, X, 2)), (TIMES, B, X), C). In this type of representation complex data becomes completely unreadable. Of course the facility for character-by-character input-output provides a basic capability for altering the external data representation, but the necessary programming is substantial. Possibly a more sophisticated and general input-output facility could be developed along the lines of E. T. Irons' "Syntax-Directed Compiler" [ Comm. ACM 4, 51 (1961) ].

    4) The basic structure of lists themselves seems to impose certain limitations and inefficiencies on facilities for numerical data and arrays. Thus for example it is impossible to introduce arrays as elements of lists, or to make any use of the extra bits in list words which are not used for addresses, or to perform arithmetic operations without using two words of free list storage for the result. Difficulties of this sort seem to appear in every list processing system with which the reviewer is familiar. A potential answer may lie in the concept of plex processing which has been introduced by D. T. Ross [ Comm. ACM 4, 147 (1961) ] , but the development of this concept raises difficult problems of automatic storage allocation and recovery.

    Both as a programming primer and as a reference document, the LISP 1.5 manual is clear and reasonably detailed, and represents a substantial improvement over earlier reports on the language. However, a more detailed discussion of the compiling and assembly facilities, as well as the calling conventions for machine-language functions, would be helpful. It should be noted that two separate publications of this manual, by the MIT Computation Center and by MIT Press, appear to be identical in content.

    For a short and elementary introduction to LISP, the paper "Atoms and Lists", [ Woodward, P. M., and Jenkins, D. P., Computer J. 4, 47 (1961) ] is recommended.




          in ACM Computing Reviews 5(03) May-June 1964
  • Samuel, A.L. "Time Sharing on a Multiconsole Computer" MIT-LCS-TR-017 1965 view details
  • Bachman, C. W. review of Bobrow and Raphael 1963
          in ACM Computing Reviews 7(03) May-June 1966
  • Newell, Alan, Jay Earley and Fred Haney "*l manual" Carnegie Institute of Technology June 26, 1967 Extract: Conversational languages
    The situation normally faced by the designer of a system is that he has a set of distinct facilities - such as languages, editors, debuggers, timing packages, and cross-reference programs. Each of these is created with a separate set of linguistic and interactive conventions. There is integration only at the level of the underlying operating system, seen by the user-designer as a single uniform command language. On the other hand, general interactive systems from JOSS (Shaw 1965) onward have adopted the other position by integrating all the facilities with the main language system. LISP, APL and L* belong in this latter category, along with a less well-known system called LCC (Perlis, Mitchell, and VanZoeren 1965). Though at heart a JOSS-like system, LCC has many of the features of a standard HLL, and thus represents a rather unique marriage of HLL constructs with the philosophy of complete integration.
  • Perrottet, M. and Calmet, J. "Sixth-Order Radiative Corrections to the Anomalous Magnetic Moment of the Electron" Physical Review D Volume 3 Number 12, 15 June 1971 Extract: Introduction
    Introduction

    Some of the contributions from the 72 diagrams contributing to the α^3 part of the anomalous magnetic moment of the electron [a_e = (1/2)(g_e - 2)] have already been calculated. Recently the measurement of a_e has been performed by Wesley and Rich with an accuracy of 6 ppm, so that a knowledge of the remaining contributions is urgently needed. One of the main features of this kind of calculation is the amount of algebra involved. A great part of it can only be done reasonably by computer. In fact, in the above-mentioned calculations, computer techniques were broadly used. Two programs devoted to symbolic manipulations were mainly used: the REDUCE system of Hearn, and SCHOONSCHIP, a machine-code program developed by Veltman. In the method we are reporting, a slightly different method is used, and all the manipulations involved in the calculations are performed by a computer. The program written by one of us (J.C.) is intended to be general enough to calculate the contributions of all of the relevant diagrams.
    To do this, we had to develop a suitable method of renormalization and the appropriate calculational techniques. It appeared that the functional formalism and the well-known Feynman method were suitable for our purpose. The formulation of the renormalization theory in the framework of the functional formalism permits us to determine unambiguously the counterterms which have to be subtracted from a diagram to make its contribution finite. This, and the fact that the tensorial dependence of the skeleton divergence of a diagram contributing to a_e is in γ_μ, have been used to develop a numerical method of cancellation of the ultraviolet divergences. The main features of this formulation are given in Sec. II.
    Section III is devoted to a survey of the program. Without entering into technical details, we stress its possibilities and limitations. This program is written in the LISP programming language. It takes as input a set of expressions describing a diagram and gives in the output the contributions to a_e in terms of multidimensional integrals over the Feynman parameters. It is also possible to compute either the anomalous magnetic moment of the muon (a_μ) or the difference a_e - a_μ of the electron and muon magnetic moments.
    The integrals in hand are numerically estimated by means of a subroutine due to Scheppey, Dufner, and Lautrup. Section IV contains a brief description of this subroutine and describes how the infrared divergences are removed, again using a numerical method.
    The last section is devoted to the latest results obtained. We have chosen to compute first the diagrams with vacuum polarization insertions.
    While we were compiling our results, a paper by Brodsky and Kinoshita appeared, reporting on the same calculations. Both results are in agreement.
  • Stock, Karl F. "A listing of some programming languages and their users" in RZ-Informationen. Graz: Rechenzentrum Graz 1971 135 view details Abstract: 321 Programmiersprachen mit Angabe der Computer-Hersteller, auf deren Anlagen die entsprechenden Sprachen verwendet werden kennen. Register der 74 Computer-Firmen; Reihenfolge der Programmiersprachen nach der Anzahl der Herstellerfirmen, auf deren Anlagen die Sprache implementiert ist; Reihenfolge der Herstellerfirmen nach der Anzahl der verwendeten Programmiersprachen.

    [321 programming languages with indication of the computer manufacturers, on whose machinery the appropriate languages are used to know.  Register of the 74 computer companies;  Sequence of the programming languages after the number of manufacturing firms, on whose plants the language is implemented;  Sequence of the manufacturing firms after the number of used programming languages.]
  • Sammet, Jean E., "Roster of Programming Languages 1972" 149 view details
          in Computers & Automation 21(6B), 30 Aug 1972 view details
  • Sammet, Jean E. "Roster of Programming Languages for 1973" p147 view details
          in ACM Computing Reviews 15(04) April 1974
  • Stock, Marylene and Stock, Karl F. "Bibliography of Programming Languages: Books, User Manuals and Articles from PLANKALKUL to PL/I" Verlag Dokumentation, Pullach/Munchen 1973 327 Abstract: PREFACE AND INTRODUCTION
    The exact number of all the programming languages still in use, and those which are no longer used, is unknown. Zemanek calls the abundance of programming languages and their many dialects a "language Babel". When a new programming language is developed, only its name is known at first and it takes a while before publications about it appear. For some languages, the only relevant literature stays inside the individual companies; some are reported on in papers and magazines; and only a few, such as ALGOL, BASIC, COBOL, FORTRAN, and PL/1, become known to a wider public through various text- and handbooks. The situation surrounding the application of these languages in many computer centers is a similar one.

    There are differing opinions on the concept "programming languages". What is called a programming language by some may be termed a program, a processor, or a generator by others. Since there are no sharp borderlines in the field of programming languages, works were considered here which deal with machine languages, assemblers, autocoders, syntax and compilers, processors and generators, as well as with general higher programming languages.

    The bibliography contains some 2,700 titles of books, magazines and essays for around 300 programming languages. However, as shown by the "Overview of Existing Programming Languages", there are more than 300 such languages. The "Overview" lists a total of 676 programming languages, but this is certainly incomplete. One author has already announced the "next 700 programming languages"; it is to be hoped the many users may be spared such a great variety for reasons of compatibility. The graphic representations (illustrations 1 & 2) show the development and proportion of the most widely-used programming languages, as measured by the number of publications listed here and by the number of computer manufacturers and software firms who have implemented the language in question. The illustrations show FORTRAN to be in the lead at the present time. PL/1 is advancing rapidly, although PL/1 compilers are not yet seen very often outside of IBM.

    Some experts believe PL/1 will replace even the widely-used languages such as FORTRAN, COBOL, and ALGOL. If this does occur, it will surely take some time - as shown by the chronological diagram (illustration 2).

    It would be desirable from the user's point of view to reduce this language confusion down to the most advantageous languages. Those languages still maintained should incorporate the special facets and advantages of the otherwise superfluous languages. Obviously such demands are not in the interests of computer production firms, especially when one considers that a FORTRAN program can be executed on nearly all third-generation computers.

    The titles in this bibliography are organized alphabetically according to programming language, and within a language chronologically and again alphabetically within a given year. Preceding the first programming language in the alphabet, literature is listed on several languages, as are general papers on programming languages and on the theory of formal languages (AAA).
    As far as possible, most of the titles are based on autopsy. However, the bibliographical description of some titles will not satisfy bibliography-documentation demands, since they are based on inaccurate information in various sources. Translation titles whose original titles could not be found through bibliographical research were not included. In view of the fact that many libraries do not have the quoted papers, all magazine essays should have been listed with the volume, the year, issue number and the complete number of pages (e.g. pp. 721-783), so that interlibrary loans could take place with fast reader service. Unfortunately, these data were not always found.

    It is hoped that this bibliography will help the electronic data processing expert, and those who wish to select the appropriate programming language from the many available, to find a way through the language Babel.

    We wish to offer special thanks to Mr. Klaus G. Saur and the staff of Verlag Dokumentation for their publishing work.

    Graz / Austria, May, 1973
  • Bobrow, D.G. and B. Raphael, "New programming languages for artificial intelligence"
          in [ACM] ACM Computing Surveys (CSUR) 6(3) September 1974
  • Leavenworth, Burt M.; Sammet, Jean E. "An overview of nonprocedural languages" pp1-12 Abstract: This paper attempts to describe some of the basic characteristics and issues involving the class of programming languages commonly referred to as "nonprocedural" or "very high level". The paper discusses major issues such as terminology, relativeness, and arbitrary sequencing. Five features of nonprocedural languages are described, and a number of specific languages are discussed briefly. A short history of the subject is included.

          in Proceedings of the ACM SIGPLAN symposium on Very high level languages, March 28-29, 1974, Santa Monica, California, United States
  • Licklider, J.C.R. "USER-ORIENTED INTERACTIVE COMPUTER GRAPHICS" Proceedings of the ACM/SIGGRAPH workshop on User-oriented design of interactive graphics systems Pittsburgh, PA 1976 pp89-96 view details Extract: Graphics Embedded in LISP and APL-Like Languages
    Graphics Embedded in LISP and APL-Like Languages
    Graphics has dominated most of the computer systems (such as systems for computer aided design and systems for the layout of LSI chips) that have made extensive use of interactive graphics, and the languages through which users have interacted with such systems have been primarily graphical languages. The languages in which such systems have been implemented, however, have usually been ordinary programming languages with little or no graphics capability of their own. It seems to me that there is an important middle ground, a place for systems that employ both graphical and alphanumeric input and output and that are operated or controlled through mixed media.
    It seems to me, also, that there is a need for languages that can serve both as programming and interaction languages and that incorporate the essential graphic forms and graphics operations along with the data structures, control structures, and basic operations required in programming systems and in interacting with systems that have been programmed.
    My favorite candidate language for the role just described is a derivative of LISP called "MDL" ("More Data-structures than Lisp"). But instead of pushing it, particularly, I want to point in the general direction of LISP and APL, languages that have implementations with both interpreters and compilers, the interpreters being in fact complete programming systems with built in libraries, documentation subsystems, debugging aids, and so on -- and the compilers being already reasonably efficient in some instances and in principle very amenable to high optimization.
    I think that some characteristics of LISP and APL are of the essence of user orientation, and I think that interactive computer graphics can correct some of the characteristics that are not. Some of the good features of LISP are the simplicity of its data and control structures, the essential modularity of LISP programs (functions), the fact that LISP programs are processible as data by other LISP programs, and the fact that a single program can include both interpreted and compiled parts. Some of the good features of APL are its abundance of array operators and the terseness with which complex procedures can be expressed.
    A few bad features should be acknowledged: LISP has too many parentheses and too many terms that are not properly self-explanatory, and APL reads backwards, but graphical representation could cure those ills.
    What I advocate, then, is a graphics-augmented LISP-like or APL-like language that can be used both to program and to operate interactive graphics systems.
    For nonprogrammer users, some of the more sophisticated parts of the language would ordinarily be held back, but essentially there would be a single, uniform language for graphics and nongraphics, for programming and operation. The main consideration that forces me to qualify my claims for the value of such a language is that both LISP and APL compilers require much further development before they can provide the efficient code required in the final stages of system development. However, as Bruce Daniels is arguing very convincingly [3], the greatest optimization can be achieved with the highest-level language, and the future therefore looks good for efficiently compiled code from such a language as I am advocating.
  • Sammet, Jean E "Roster of programming languages for 1976-77" pp56-85 view details
          in SIGPLAN Notices 13(11) Nov 1978
  • Sandewall, Erik "Programming in an Interactive Environment: the Lisp Experience" pp35-71 view details Extract: Introduction
    INTRODUCTION
    Why do some programming systems have to be large and complex? In recent years there has been a trend towards simple designs in computer science research. The widespread revulsion against OS/360 in the academic community led to a quest for primitive concepts in operating systems and for very simple systems, which have been successfully developed by, for example, Brinch Hansen. Similarly, reaction against large and messy programming languages encouraged the development and adoption of languages that minimized the number of facilities and features, notably PASCAL. I believe that the great attraction of very simple programming languages such as BASIC and very simple database systems such as MUMPS in the world of practical computing are other examples of the same trend towards simplicity. Despite the above, the present paper is concerned with programming systems which by necessity have to be large and complex and which are very hard to structure well because we know so little about their design. Such systems are of interest for a combination of two reasons.

    First, there is a long list of things that one wants a programming system, particularly if it is interactive, to do for the programmer. ("Programming system" is used to mean an integrated piece of software which is used to support program development, including but not restricted to a compiler.) The reader can easily generate his own list of functions, but here are some possibilities:

  • Administration of program modules and of different generations of the same module (when errors are corrected and/or the scope of the program is extended);

  • Administration of test examples and their correct results (including side effects), so that the relevant tests are performed automatically or semiautomatically when sections of the program are changed, and a report is made to the user if a discrepancy has been observed;

  • Administration of formal and informal documentation of program segments, and automatic generation of formal documentation from programs;
  • Interdialect translation;
  • Checking of compatibility between parts of programs;
  • Translation from general-purpose or specialized higher-level languages to the chosen base language ("preprocessors"), with appropriate support for compile-time and run-time error diagnostics in the base language, comments, etc.;
  • Support for a given programming methodology. For example, if top-down programming is to be encouraged, then it is natural to let the interactive programming system maintain successive decomposition steps, and mutual references between an abstract step and its decomposition;
  • Support of the interactive session. For example, a history facility allows the user to refer back to previous commands to the system, edit them, and re-execute them. An undo facility allows the programmer to back up and undo effects of previously performed incorrect commands, thus salvaging the data-structure environment that was interactively created during the interactive debugging session;
  • Specialized editing, performed with an editor that understands at least the syntax of the chosen programming language, and which therefore allows the user to refer to natural entities in this language and to give fairly high-level instructions as to where additions to the program are to be inserted;
  • Optimizing programs which transform a program into an equivalent but more efficient one;
  • Uniform insertion programs, which in a given program systematically insert additional statements, for example for producing trace printouts or for counting how often locations in the program are visited during execution.

    Second, and this is the crucial point, if these functions are performed by separate and independent programs, a considerable duplication of effort will result. Syntax analysis has to be performed not only by a compiler or interpreter, but also by specialized editors, optimizing programs, uniform insertion programs, documentation generators (such as cross-indexers), and so on. Analysis of the relationships between modules (calling structure, data-flow structure, etc.) is needed for generation of documentation, administration of test examples, compatibility controls, and program optimization. Since the results of an execution count may be used to tell an optimizer where it should spend its efforts, programs for these two tasks should be able to communicate. Also, some of the above facilities, such as the undo facility, are only possible if they are integrated into the programming system. For these reasons, it is natural to try to integrate facilities such as the above into one coherent programming system, which is capable of performing them all in an economic and systematic fashion.

    I believe that the development of integrated, interactive programming systems, and the methodology for such systems, is the major research issue for programming systems and programming methodology today. It is significant for programming methodology, since every detailed recommendation on how to write programs is also a recommendation on how to design an interactive programming system that supports the methodology. In the area of research on programming systems, this is relatively unexplored territory waiting to be considered now that other problems such as compiler design for conventional languages seems to be fairly well understood. The task of designing interactive programming systems is hard because there is no way to avoid complexity in such systems. Because of all the dependencies between different parts of an interactive programming system, it is hard to break up the design into distinct subproblems.

    The only applicable research method is to accumulate experience by implementing a system, synthesize the experience, think for a while, and start over.

    Such systems have been built and used for the programming language LISP. I believe that the time is now ripe for a synthesis and discussion of the experience that has accumulated in this context. The present paper is intended to serve such a purpose.
          in [ACM] ACM Computing Surveys (CSUR) 10(1) March 1978
  • Curtis AC "A comparison of LISP and MUMPS as implementation languages for knowledge-based systems" J Med Syst. 8(5) Oct 1984 pp399-406 view details
  • Norvig, Peter "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp" Morgan Kaufmann 1991 view details ftpSource code
  • Lins, R. D. "The Fall and Rise of FP" (Invited Paper)
          in 3rd International Summer School on Functional Programming, Braga, Portugal, 1998