INTERLISP (ID:957/int017)


Xerox PARC LISP

Interlisp is a descendant of BBN-LISP and includes a complete LISP programming environment. It is dynamically scoped; NLAMBDA functions do not evaluate their arguments, and any function can be called with fewer or more arguments than it declares. Interlisp-10 used shallow binding for variable lookup, while Xerox's Interlisp-D used deep binding.
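The calling convention described above can be sketched with a toy evaluator. This is an editorial illustration in Python, not Interlisp itself: forms are modeled as nested tuples, variables as strings, and a single global environment stands in for dynamic binding (one global value cell per variable corresponds to shallow binding; searching a chain of frames would be deep binding).

```python
# Toy sketch (not Interlisp): an NLAMBDA-style function receives its
# argument *forms* unevaluated, while a normal (LAMBDA) function
# receives their values.

env = {"x": 10}  # one dynamic environment; a value cell per variable
                 # models shallow binding

def evaluate(form):
    """Evaluate a toy Lisp form: numbers are self-evaluating,
    strings are variable references, tuples are function calls."""
    if isinstance(form, (int, float)):
        return form
    if isinstance(form, str):
        return env[form]
    fn, *args = form
    if getattr(fn, "nlambda", False):
        return fn(*args)                      # pass forms unevaluated
    return fn(*[evaluate(a) for a in args])   # evaluate arguments first

def quote_it(form):
    """NLAMBDA-style: sees the raw argument form, not its value."""
    return form
quote_it.nlambda = True

def add1(n):          # ordinary LAMBDA-style function
    return n + 1

print(evaluate((add1, "x")))      # -> 11  (x evaluated to 10 first)
print(evaluate((quote_it, "x")))  # -> 'x' (the unevaluated form)
```

The optional-argument behavior mentioned above is not modeled here; in Interlisp, missing arguments default to NIL and extras are ignored.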

INTERLISP was once one of the two main branches of LISP (the other being MACLISP). In 1981, work on Common LISP began in an effort to combine the best features of both.


Related languages
BBN-LISP => INTERLISP   Renaming
ByteLisp => INTERLISP   Bootstrap implementation
INTERLISP => Common LISP   Evolution of
INTERLISP => GENglish   Written using
INTERLISP => INTERLISP-10   Evolution of
INTERLISP => INTERLISP-D   Evolution of
INTERLISP => MultiLisp   Implementation
INTERLISP => QLISP   Derivation of
INTERLISP => UNITS   Written using

References:
  • Bobrow, D.G. and B. Raphael, "New programming languages for artificial intelligence"
          in [ACM] ACM Computing Surveys (CSUR) 6(3) September 1974
  • Teitelman, W. "Interlisp Programming Manual", TR, Xerox Palo Alto Research Center, 1974. Abstract: INTERLISP (formerly BBN LISP) has evolved from a succession of LISP systems that began with a LISP designed and implemented for the DEC PDP-1 by D. G. Bobrow and D. L. Murphy at Bolt, Beranek and Newman in 1966, and documented by D. G. Bobrow. An upwards compatible version of this LISP was implemented for the SDS 940 in 1967 by Bobrow and Murphy. This system contains the seeds for many of the capabilities and features of the current system: a compatible compiler and interpreter, uniform error handling, an on-line LISP oriented editor, sophisticated debugging facilities, etc. 940 LISP was also the first LISP system to demonstrate the feasibility of using software paging techniques and a large virtual memory in conjunction with a list-processing system. DWIM, the Do-What-I-Mean error correction facility, was introduced into the system in 1968 by W. Teitelman, who was also responsible for documentation for the 940 LISP system. Extract: Introduction
    INTRODUCTION
    This document is a reference manual for INTERLISP, a LISP system currently implemented on the DEC PDP-10 under the BBN TENEX time sharing system. INTERLISP is designed to provide the user access to the large virtual memory allowed by TENEX, with a relatively small penalty in speed (using special paging techniques described elsewhere). Additional data types have been added, including strings, arrays, and hash association tables (hash links) (Sections 7 and 10). The system includes a compatible compiler (Section 18) and interpreter. Machine code can be intermixed with INTERLISP expressions via the assemble directive of the compiler. The compiler also contains a facility for "block compilation" which allows a group of functions to be compiled as a unit, suppressing internal names. Each successive level of computation, from interpreted through compiled to block-compiled, provides greater speed at a cost of debugging ease.
    INTERLISP has been designed to be a good on-line interactive system. Some of the features provided include elaborate debugging facilities with tracing and conditional breakpoints (Section 15), and a sophisticated LISP oriented editor within the system (Section 9). Utilization of a uniform error processing through user accessible routines (Section 16) has allowed the implementation of DWIM, a Do-What-I-Mean facility, which automatically corrects many types of errors without losing the context of computation (Section 17). The CLISP facility (Section 23) extends the LISP syntax by enabling ALGOL-like infix operators such as +, -, *, /, =, ←, AND, OR, etc., as well as IF-THEN-ELSE statements and FOR-WHILE-DO statements. CLISP expressions are automatically converted to equivalent LISP forms when they are first encountered. CLISP also includes list construction operators, as well as a LISP oriented pattern match compiler.
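The infix-to-prefix conversion described above can be sketched as follows. This is an editorial illustration in Python, not the CLISP implementation: it folds a flat token list left to right into prefix forms, ignoring the operator precedence that real CLISP respects (the target function names PLUS, TIMES, etc. are Interlisp's arithmetic functions).

```python
# Minimal sketch of CLISP-style translation: infix surface syntax is
# rewritten into ordinary prefix LISP forms, here modeled as tuples.
# No precedence handling -- strictly left to right.

def clispify(tokens):
    """Translate a flat infix token list like ['a', '+', 'b'] into a
    prefix form like ('PLUS', 'a', 'b')."""
    ops = {"+": "PLUS", "-": "DIFFERENCE", "*": "TIMES", "/": "QUOTIENT"}
    result = tokens[0]
    i = 1
    while i < len(tokens):
        op, rhs = tokens[i], tokens[i + 1]
        result = (ops[op], result, rhs)
        i += 2
    return result

print(clispify(["a", "+", "b"]))            # ('PLUS', 'a', 'b')
print(clispify(["a", "+", "b", "*", "c"]))  # ('TIMES', ('PLUS', 'a', 'b'), 'c')
```

As the manual notes, the real system performs this rewriting once, the first time a CLISP expression is encountered, so later evaluations see only plain LISP forms.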
    A novel and useful facility of the INTERLISP system is the programmer's assistant (Section 22), which monitors and records all user inputs. The user can instruct the programmer's assistant to repeat a particular operation or sequence of operations, with possible modifications, or to UNDO the effects of specified operations. The goal of the programmer's assistant, DWIM, CLISP, etc. is to provide a programming environment which will "cooperate" with the user in the development of his programs, and free him to concentrate more fully on the conceptual difficulties and creative aspects of the problem he is trying to solve.
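The UNDO mechanism described above depends on recording an inverse alongside every destructive operation. A minimal sketch of that idea, in Python with illustrative names (not Interlisp's):

```python
# Sketch of undoable destructive operations: each change pushes a
# description and an inverse thunk onto a history stack; UNDO pops
# and runs the inverse, restoring the previous state.

history = []  # stack of (description, inverse_thunk) pairs

def setv(table, key, value):
    """Destructively set table[key], recording how to undo it."""
    if key in table:
        old = table[key]
        inverse = lambda: table.__setitem__(key, old)
    else:
        inverse = lambda: table.__delitem__(key)
    history.append((f"set {key}", inverse))
    table[key] = value

def undo():
    """Undo the most recent recorded operation."""
    desc, inverse = history.pop()
    inverse()
    return desc

tbl = {}
setv(tbl, "FOO", 1)
setv(tbl, "FOO", 2)
undo()                 # FOO restored to 1
print(tbl)             # {'FOO': 1}
undo()                 # FOO removed entirely
print(tbl)             # {}
```

The programmer's assistant generalizes this to all user inputs, so whole command sequences can be replayed, modified, or undone.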
    To aid in converting to INTERLISP programs written in other LISP dialects, e.g., LISP 1.5, Stanford LISP, we have implemented TRANSOR, a subsystem which accepts transformations (or can operate from previously defined transformations), and applies these transformations to source programs written in another LISP dialect, producing object programs which will run on INTERLISP (Appendix 1). In addition, TRANSOR alerts the programmer to problem areas that (may) need further attention. TRANSOR was used extensively in converting from 940 LISP to BBN-LISP on the PDP-10. A set of transformations is available for converting from Stanford LISP and LISP 1.6 to INTERLISP.
    A complete format-directed list-processing system, FLIP, is available for use within INTERLISP.
    Although we have tried to be as clear and complete as possible, this document is not designed to be an introduction to LISP. Therefore, some parts may only be clear to people who have had some experience with other LISP systems. A good introduction to LISP has been written by Clark Weissman. Although it is not completely accurate with respect to INTERLISP, the differences are small enough to be mastered by use of this manual and on-line interaction. Another useful introduction is given by Berkeley in the collection of Berkeley and Bobrow.

  • Epp, B. "InterLISP Programmierhandbuch" [InterLISP Programming Manual], Institut für Deutsche Sprache, D-6800 Mannheim 1, March 1977
  • Goodwin, J. "A Guided Tour of the InterLISP System, Part I", Linköping University Institute of Technology, LiTH-MAT-R-77-2, Linköping, 1977
  • Deutsch, P. "Inside InterLISP: Two Implementations", Xerox PARC, Palo Alto, California, Nov. 26, 1978
  • Sandewall, Erik "Programming in an Interactive Environment: the Lisp Experience" pp35-71 Extract: Introduction
    INTRODUCTION
    Why do some programming systems have to be large and complex? In recent years there has been a trend towards simple designs in computer science research. The widespread revulsion against OS/360 in the academic community led to a quest for primitive concepts in operating systems and for very simple systems, which have been successfully developed by, for example, Brinch Hansen. Similarly, reaction against large and messy programming languages encouraged the development and adoption of languages that minimized the number of facilities and features, notably PASCAL. I believe that the great attraction of very simple programming languages such as BASIC and very simple database systems such as MUMPS in the world of practical computing is another example of the same trend towards simplicity. Despite the above, the present paper is concerned with programming systems which by necessity have to be large and complex and which are very hard to structure well because we know so little about their design. Such systems are of interest for a combination of two reasons.

    First, there is a long list of things that one wants a programming system, particularly if it is interactive, to do for the programmer. ("Programming system" is used to mean an integrated piece of software which is used to support program development, including but not restricted to a compiler.) The reader can easily generate his own list of functions, but here are some possibilities:

  • Administration of program modules and of different generations of the same module (when errors are corrected and/or the scope of the program is extended);

  • Administration of test examples and their correct results (including side effects), so that the relevant tests are performed automatically or semiautomatically when sections of the program are changed, and a report is made to the user if a discrepancy has been observed;

  • Administration of formal and informal documentation of program segments, and automatic generation of formal documentation from programs;
  • Interdialect translation;
  • Checking of compatibility between parts of programs;
  • Translation from general-purpose or specialized higher-level languages to the chosen base language ("preprocessors"), with appropriate support for compile-time and run-time error diagnostics in the base language, comments, etc.;
  • Support for a given programming methodology. For example, if top-down programming is to be encouraged, then it is natural to let the interactive programming system maintain successive decomposition steps, and mutual references between an abstract step and its decomposition;
  • Support of the interactive session. For example, a history facility allows the user to refer back to previous commands to the system, edit them, and re-execute them. An undo facility  allows the programmer to back up and undo effects of previously performed incorrect commands, thus salvaging the data-structure environment that was interactively created during the interactive debugging session;
  • Specialized editing, performed with an editor that understands at least the syntax of the chosen programming language, and which therefore allows the user to refer to natural entities in this language and to give fairly high-level instructions as to where additions to the program are to be inserted;
  • Optimizing programs, which transform a program into an equivalent but more efficient one;
  • Uniform insertion programs, which in a given program systematically insert additional statements, for example for producing trace printouts or for counting how often locations in the program are visited during execution.
    Second, and this is the crucial point, if these functions are performed by separate and independent programs, a considerable duplication of effort will result. Syntax analysis has to be performed not only by a compiler or interpreter, but also by specialized editors, optimizing programs, uniform insertion programs, documentation generators (such as cross-indexers), and so on. Analysis of the relationships between modules (calling structure, data-flow structure, etc.) is needed for generation of documentation, administration of test examples, compatibility controls, and program optimization. Since the results of an execution count may be used to tell an optimizer where it should spend its efforts, programs for these two tasks should be able to communicate. Also, some of the above facilities, such as the undo facility, are only possible if they are integrated into the programming system. For these reasons, it is natural to try to integrate facilities such as the above into one coherent programming system, which is capable of performing them all in an economic and systematic fashion.

    I believe that the development of integrated, interactive programming systems, and the methodology for such systems, is the major research issue for programming systems and programming methodology today. It is significant for programming methodology, since every detailed recommendation on how to write programs is also a recommendation on how to design an interactive programming system that supports the methodology. In the area of research on programming systems, this is relatively unexplored territory waiting to be considered now that other problems, such as compiler design for conventional languages, seem to be fairly well understood. The task of designing interactive programming systems is hard because there is no way to avoid complexity in such systems. Because of all the dependencies between different parts of an interactive programming system, it is hard to break up the design into distinct subproblems.

    The only applicable research method is to accumulate experience by implementing a system, synthesize the experience, think for a while, and start over.

    Such systems have been built and used for the programming language LISP. I believe that the time is now ripe for a synthesis and discussion of the experience that has accumulated in this context. The present paper is intended to serve such a purpose.
          in [ACM] ACM Computing Surveys (CSUR) 10(1) March 1978
  • Teitelman, W. "INTERLISP Reference Manual", Xerox Palo Alto Research Center, Calif., 1978.
  • Bobrow, Daniel G. and L. Peter Deutsch "Extending Interlisp for modularization and efficiency" pp481-489
          in E.W. Ng (ed) "Symbolic & Algebraic Computation Proceedings of EUROSAM 79" Springer-Verlag, Berlin, 1979
  • Moore, J Strother II "The Interlisp Virtual Machine Specification", Technical Report CSL 76-5, Xerox Palo Alto Research Center, March 1979.
  • Koomen, J.A.G.M., "The Interlisp Virtual Machine: A Study of its Design and its Implementation as Multilisp", Master's thesis, University of British Columbia, 1980.
  • Pär Emanuelson, Anders Haraldsson "On compiling embedded languages in LISP" pp208-215 Abstract: In INTERLISP we find a number of embedded languages such as the iterative statement and the pattern match facility in the CLISP package, the editor and makefile languages and so forth. We will in this paper concentrate on the problem of extending the LISP language and discuss a method to compile such extensions. We propose the language to be implemented through an interpreter (written in LISP) and that compilation of statements in such an embedded language is done through partial evaluation. The interpreter is partially evaluated with respect to the actual statements, and an object program in LISP is obtained. This LISP code can further be compiled to machine code by the standard LISP compiler. We have implemented the iterative statement and a CLISP-like pattern matcher and used a program manipulation system to generate object programs in LISP. Comparisons will be made with the corresponding INTERLISP implementations, which use special purpose compilers in order to generate the LISP code.
          in [ACM] Proceedings of the 1980 ACM Conference on LISP and functional programming 1980, Stanford University
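    The compilation-by-partial-evaluation idea in the abstract above can be sketched as follows. This is an editorial illustration in Python, not the paper's system: the embedded language is a tiny iterative statement, and "partial evaluation" is modeled by generating a residual closure in which the statement's structure has been consumed at specialization time.

```python
# Sketch: an embedded language implemented by an interpreter, then
# "compiled" by specializing the interpreter to a fixed statement.
# Statement form (illustrative): ("FOR", var, lo, hi, body_fn).

def interpret(stmt, env):
    """Direct interpreter: re-inspects the statement on every call."""
    _for, var, lo, hi, body = stmt
    return [body({**env, var: i}) for i in range(lo, hi + 1)]

def compile_stmt(stmt):
    """Partial evaluation: the variable name and loop bounds are
    static, so the residual code never looks at the statement again."""
    _for, var, lo, hi, body = stmt          # consumed once, here
    indices = range(lo, hi + 1)
    def residual(env):
        return [body({**env, var: i}) for i in indices]
    return residual

stmt = ("FOR", "i", 1, 4, lambda e: e["i"] * e["n"])
print(interpret(stmt, {"n": 10}))   # [10, 20, 30, 40]
fast = compile_stmt(stmt)
print(fast({"n": 10}))              # [10, 20, 30, 40]
```

In the paper's setting the residual program is LISP code, which the standard LISP compiler can then compile to machine code; the closure here plays the role of that residual program.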
  • Dyer, D.; Balzer, B.; Bates, R.; Koomen, H.; Lynch, D.: InterLISP-VAX Pre-Announcement, Inform. Sciences Institute, InterLISP Project, Marina del Rey, July 17, 1981
  • Raymond L. Bates, David Dyer, Johannes A. G. M. Koomen "Implementation of Interlisp on the VAX" pp81-87 Abstract: This paper presents some of the issues involved in implementing Interlisp [19] on a VAX computer [24] with the goal of producing a version that runs under UNIX [17], specifically Berkeley VM/UNIX. This implementation has the following goals:
  • To be compatible with and functionally equivalent to Interlisp-10.
  • To serve as a basis for future Interlisp implementations on other mainframe computers. This goal requires that the implementation to be portable.
  • To support a large virtual address space.
  • To achieve a reasonable speed.

    The implementation draws directly from three sources: Interlisp-10 [19], Interlisp-D [5], and Multilisp [12]. Interlisp-10, the progenitor of all Interlisps, runs on the PDP-10 under the TENEX [2] and TOPS-20 operating systems. Interlisp-D, developed at Xerox Palo Alto Research Center, runs on personal computers also developed at PARC. Multilisp, developed at the University of British Columbia, is a portable interpreter containing a kernel of Interlisp, written in Pascal [9] and running on the IBM Series/370 and the VAX. The Interlisp-VAX implementation relies heavily on these implementations. In turn, Interlisp-D and Multilisp were developed from The Interlisp Virtual Machine Specification [15] by J Moore (subsequently referred to as the VM specification), which discusses what is needed to implement an Interlisp by describing an Interlisp Virtual Machine from the implementors' point of view. Approximately six man-years of effort have been spent exclusively in developing Interlisp-VAX, plus the benefit of many years of development for the previous Interlisp implementations.
    Extract: Anecdote
    History of the Project

    A few years ago the research community ceased to consider Interlisp-10 a useful research vehicle because of its limited address space. A search began to provide a new LISP environment powerful enough to support current and future research. There was considerable discussion of abandoning the Interlisp dialect entirely in favor of Maclisp [14], LISP Machine LISP [25], NIL [26], or Common LISP. The choice of LISP dialect would to some extent dictate the choice of hardware. Potentially attractive hardware were the CADR [11] (MIT LISP Machines) and Xerox 1100 Scientific Information Processors (Interlisp-D machines, also known as Dolphins or D0's). Both are personal LISP machines. Also considered were machines not specifically oriented toward LISP. They included the PERQ and the PRIME (both personal machines), as well as the M68000-based personal machines, which were promised to be available "soon." The high cost and unpredictable future of each of these personal machines were strong influences against their selection. The new feature of extended addressing on TOPS-20 was also considered and rejected as the basis for a new LISP implementation on the PDP-10. The DEC VAX computer was selected as the machine to host the new Interlisp for several reasons. It has become an extremely popular machine, especially for universities and research facilities.

    Although each of the alternative machines has acquired a user community, none approaches the popularity of the VAX. The VAX family of computers promises to have a long life, to be widely available, to be extensively supported, and to have a wide variety of price and performance ranges. It is anticipated that the family will be extended both up in performance and down in price. All of these characteristics enhance the usefulness and longevity of Interlisp-VAX compared to the alternatives. In June 1980 serious work began on the development and implementation of an Interlisp compatible with the VAX series of computers. Initially, most of the effort was directed at the planning and detailed design of the implementation of various critical parts. By the end of the year, the writing of code specific to Interlisp-VAX was begun. Using the Multilisp system as a template, a new Interlisp kernel was developed in the language C [10]. In parallel, the existing Interlisp compiler was modified to produce VAX code. Both of these tasks were essentially completed by August 1981. Since the beginning of 1981, various parts of the existing Interlisp code have been adapted or rewritten to fit the VAX-UNIX mold. Currently the project is substantially completed. The first release of the Interlisp-VAX system was made publicly available in March 1982.
          in [ACM] Proceedings of the 1982 ACM Conference on LISP and Functional Programming, Pittsburgh, Pennsylvania, United States