Pascal (520/pas003)


Wirth's development of the Algol-W proposal 


Teaching language designed by Wirth in reaction to the ALGOL 68 report, in spirit allied with the ALGOL W proposal

Named for Blaise Pascal

Featured enumeration types, subranges, sets, variant records, and the case statement.

Wirth (2001) states that the statically typed pointers came from Hoare, the semantics from ALGOL W, and the record structures from COBOL.
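
The fragment below is an illustrative sketch only (the declarations are invented for this entry, not taken from any cited source); it shows how these constructs appear in Pascal, including a Hoare-style pointer bound to a single record type:

  program Example(output);
  type
    Day     = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);   { enumeration type }
    WorkDay = Mon..Fri;                              { subrange type }
    DaySet  = set of Day;                            { set type }
    Link    = ^Person;                               { pointer bound to one type }
    Person  = record                                 { record with a variant part }
                name: packed array [1..20] of char;
                case married: Boolean of
                  true:  (spouse: Link);
                  false: ()
              end;
  var
    d: Day;
    open: DaySet;
  begin
    open := [Mon..Fri];
    for d := Mon to Sun do
      case d of                                      { case statement }
        Mon, Tue, Wed, Thu, Fri: if d in open then writeln('working day');
        Sat, Sun:                writeln('weekend')
      end
  end.

A Link value may refer only to Person records; this binding of every pointer to a fixed type is the contribution credited to Hoare above.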


People:
Structures:
Related languages:
ALGOL 68 => Pascal   Influence
ALGOL W => Pascal   Influence
COBOL-61 Extended => Pascal   Incorporated some features of
EULER => Pascal   Evolution of
Hoare super-structured Algol => Pascal   Influence
Pascal => ACL   Augmentation of
Pascal => Actus   Extension of
Pascal => ADLIB   Superset
Pascal => AL   Augmentation of
Pascal => Alphard   Based on
Pascal => BPL   Influence
Pascal => CDC-Pascal   Implementation
Pascal => CHAMIL   Based on
Pascal => CHARM   Influence
Pascal => CLU   Based on
Pascal => COMAL   Influence
Pascal => Concurrent Pascal   Extension of
Pascal => COPAS   Augmentation of
Pascal => ESP   Subset
Pascal => Euclid   Evolution of
Pascal => FLAN   Based on
Pascal => GAL   Representation of
Pascal => HP-PASCAL   Dialect of
Pascal => HSL   Strong Influence
Pascal => INTERACTIVE   Based on
Pascal => IP Pascal   Augmentation of
Pascal => IPL   Implementation
Pascal => JOSSLE   Influence
Pascal => LEGOS   Extension of
Pascal => LIMBO   Influence
Pascal => LIS   Influence
Pascal => MATRIX PASCAL   Extension of
Pascal => Mesa   Influence
Pascal => MIDAS   Implementation
Pascal => Minnesota Pascal 6000   Extension of
Pascal => MIRA   Augmentation of
Pascal => MODEF   Influence
Pascal => Modula   Evolution of
Pascal => Modula-2   Positive Strong Influence
Pascal => MSL   Influence
Pascal => Newton   Based on
Pascal => OSU APL   Influence
Pascal => PASAMS   Extension of
Pascal => Pascal-   Subset
Pascal => Pascal (Jensen and Wirth)   Evolution of
Pascal => Pascal*   Evolution of
Pascal => PASCAL/11   Extension of
Pascal => Pascal/V   Implementation
Pascal => Pascal+CSP   Extension of
Pascal => Pascal-Linda   Augmentation of
Pascal => Pascal-m   Extension of
Pascal => Pascal-S   Subset
Pascal => PASION   Influence
Pascal => Pasqual   Extension of
Pascal => PASRO   Extension of
Pascal => PASSIM   Extension of
Pascal => Path Pascal   Extension of
Pascal => PLAIN   Incorporates some features of
Pascal => Platon   Augmentation of
Pascal => POLROB   Based on
Pascal => PRAXIS   Extension of
Pascal => PT   Subset
Pascal => RAIL   Extension of
Pascal => Rigel   Extension of
Pascal => S*   Extension of
Pascal => S-Basic   Influence
Pascal => SB-Pascal   Implementation
Pascal => Sequence Pascal   Implementation
Pascal => Simone   Based on
Pascal => SIMPAS   Extension of
Pascal => SQURL   Influence
Pascal => SRL   Influence
Pascal => Stanford Pascal   Implementation
Pascal => Sue   Influence
Pascal => TELOS   Based on
Pascal => TEMPO   Influence
Pascal => Transforma   Based on
Pascal => Turbo Prolog   Influence
Pascal => W2   Based on

References:
  • Sammet, Jean E. "Brief survey of languages used for systems implementation" view details Extract: PASCAL
    Pascal
    A new language Pascal, based on the style of ALGOL 60 but not an extension of it, has been developed which is suitable for writing compilers and for describing (and presumably implementing) operating systems [Wirth, 1971a]. Pascal has added those features to ALGOL which are essential for systems programming. More specifically, Pascal contains character handling, structures for data hierarchy, records, files, pointers, and set operators. Input/output is handled by defining two standard file variables input and output such that every Pascal program is regarded as a procedure with these two variables as formal parameters; the actual parameters are primarily specified by the computer installation. Pascal is implemented on the CDC 6000 family with the compiler written completely in Pascal; compilers for other machines are underway. The most interesting use of this language is by Brinch Hansen who uses it to describe and write algorithms for part of a multiprogramming system.
          in [ACM] SIGPLAN Notices 6(10) October 1971 Proceedings of the SIGPLAN symposium on Languages for system implementation 1971, Lafayette, Indiana, United States; October, 1971 view details
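    The input/output convention described in the extract above can be seen in a minimal sketch (hypothetical, not taken from the cited survey): the program heading names the standard files input and output as formal parameters, and the installation supplies the actual files or devices.

    program Copy(input, output);
    { Copies its standard input to its standard output, line by line. }
    var ch: char;
    begin
      while not eof(input) do
      begin
        while not eoln(input) do
        begin
          read(ch);
          write(ch)
        end;
        readln;
        writeln
      end
    end.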
  • Wirth, N. "The Programming Language Pascal" view details
          in Acta Informatica 1(1) January, 1971 view details
  • Rosen, S. "Programming Systems and Languages 1965-1975" view details Abstract: In spite of impressive gains by PL/I, Fortran and Cobol remain the languages in which most of the world's production programs are written and will remain so into the foreseeable future. There is a great deal of theoretical interest in Algol 68 and in extensible languages, but so far at least they have had little practical impact. Problem-oriented languages may very well become the most important language development area in the next five to ten years. In the operating system area all major computer manufacturers set out to produce very ambitious multiprogramming systems, and they all ran into similar problems. A number of university projects, though not directly comparable to those of the manufacturers, have contributed greatly to a better understanding of operating system principles. Important trends include the increased interest in the development of system measurement and evaluation techniques, and increased use of microprogramming for some programming system functions. DOI
          in [ACM] CACM 15(07) (July 1972) view details
  • Sammet, Jean E., "Roster of Programming Languages 1972" 206 view details
          in Computers & Automation 21(6B), 30 Aug 1972 view details
  • Wirth, Niklaus "On PASCAL, code generation, and the CDC 6000 computer" Stanford University, Department of Computer Science Report Number: CS-TR-72-257 February 1972 view details Abstract: "PASCAL" is a general purpose programming language with characteristics similar to ALGOL 60, but with an enriched set of program- and data structuring facilities. It has been implemented on the CDC 6000 computer. This paper discusses selected topics of code generation, in particular the selection of instruction sequences to represent simple operations on arithmetic, Boolean, and powerset operands. Methods to implement recursive procedures are briefly described, and it is hinted that the more sophisticated solutions are not necessarily also the best. The CDC 6000 architecture appears as a frequent source of pitfalls and nuisances, and its main trouble spots are scrutinized and discussed.
  • Habermann, A. Nico "Critical Comments on the Programming Language Pascal" pp47-57 view details
          in Acta Informatica 3(1) December 31, 1973 view details
  • Hoare, C.A.R. and Wirth, N. "An axiomatic definition of the programming language PASCAL" pp335-355 view details
          in Acta Informatica 2(4) December, 1973 view details
  • Sammet, Jean E. "Roster of Programming Languages for 1973" p147 view details
          in ACM Computing Reviews 15(04) April 1974 view details
  • Stock, Marylene and Stock, Karl F. "Bibliography of Programming Languages: Books, User Manuals and Articles from PLANKALKUL to PL/I" Verlag Dokumentation, Pullach/Munchen 1973 448 view details Abstract: PREFACE  AND  INTRODUCTION
    The exact number of all the programming languages still in use, and those which are no longer used, is unknown. Zemanek calls the abundance of programming languages and their many dialects a "language Babel". When a new programming language is developed, only its name is known at first and it takes a while before publications about it appear. For some languages, the only relevant literature stays inside the individual companies; some are reported on in papers and magazines; and only a few, such as ALGOL, BASIC, COBOL, FORTRAN, and PL/1, become known to a wider public through various text- and handbooks. The situation surrounding the application of these languages in many computer centers is a similar one.

    There are differing opinions on the concept "programming languages". What is called a programming language by some may be termed a program, a processor, or a generator by others. Since there are no sharp borderlines in the field of programming languages, works were considered here which deal with machine languages, assemblers, autocoders, syntax and compilers, processors and generators, as well as with general higher programming languages.

    The bibliography contains some 2,700 titles of books, magazines and essays for around 300 programming languages. However, as shown by the "Overview of Existing Programming Languages", there are more than 300 such languages. The "Overview" lists a total of 676 programming languages, but this is certainly incomplete. One author has already announced the "next 700 programming languages"; it is to be hoped the many users may be spared such a great variety for reasons of compatibility. The graphic representations (illustrations 1 & 2) show the development and proportion of the most widely-used programming languages, as measured by the number of publications listed here and by the number of computer manufacturers and software firms who have implemented the language in question. The illustrations show FORTRAN to be in the lead at the present time. PL/1 is advancing rapidly, although PL/1 compilers are not yet seen very often outside of IBM.

    Some experts believe PL/1 will replace even the widely-used languages such as FORTRAN, COBOL, and ALGOL. If this does occur, it will surely take some time - as shown by the chronological diagram (illustration 2).

    It would be desirable from the user's point of view to reduce this language confusion down to the most advantageous languages. Those languages still maintained should incorporate the special facets and advantages of the otherwise superfluous languages. Obviously such demands are not in the interests of computer production firms, especially when one considers that a FORTRAN program can be executed on nearly all third-generation computers.

    The titles in this bibliography are organized alphabetically according to programming language, and within a language chronologically and again alphabetically within a given year. Preceding the first programming language in the alphabet, literature is listed on several languages, as are general papers on programming languages and on the theory of formal languages (AAA).
    As far as possible, most of the titles are based on autopsy. However, the bibliographical description of some titles will not satisfy bibliography-documentation demands, since they are based on inaccurate information in various sources. Translation titles whose original titles could not be found through bibliographical research were not included. In view of the fact that many libraries do not have the quoted papers, all magazine essays should have been listed with the volume, the year, issue number and the complete number of pages (e.g. pp. 721-783), so that interlibrary loans could take place with fast reader service. Unfortunately, these data were not always found.

    It is hoped that this bibliography will help the electronic data processing expert, and those who wish to select the appropriate programming language from the many available, to find a way through the language Babel.

    We wish to offer special thanks to Mr. Klaus G. Saur and the staff of Verlag Dokumentation for their publishing work.

    Graz / Austria, May, 1973
  • Hoare, C. A. R. and Wirth, N. "Addenda and Corrigenda to An Axiomatic Definition of the Programming Language Pascal" pp296-296 view details
          in Acta Informatica 3(3) July 22, 1974 view details
  • Wirth, N., "On the Design of Programming Languages", pp386-393 view details
          in Rosenfeld, Jack L. (Ed.): Information Processing 74, Proceedings of IFIP Congress 74, Stockholm, Sweden, August 5-10, 1974 view details
  • Lecarme, Olivier and Desjardins, Pierre "More Comments on the Programming Language Pascal" pp231-243 view details
          in Acta Informatica 4(3) July 31, 1975 view details
  • Wirth, N., "An Assessment of the Programming Language Pascal" view details
          in IEEE Transactions on Software Engineering, June 1975 view details
  • Rig Associates Inc "Evaluation of CORAL 66, PASCAL, CS-4, TACPOL, CMS-2" Rig Associates Inc Reston Va 18 Nov 76 AD-A037 636/8WC view details
  • The Higher Order Language Working Group (HOLWG) Working Paper on 23 existing programming languages view details
  • Wichmann, B. A. "Ackermann's function: a study in the efficiency of calling procedures" BIT 16 (1976), pp103-110 view details Abstract: A six line recursive procedure is used to assess the efficiency of the procedure calling mechanism in ALGOL-like languages. The results from some 40 systems varying from ALGOL 68 and PL/I to System Implementation Languages for minicomputers are presented and compared. A hundred to one variation in performance occurs with this test, the major reasons for which are given.
    Extract: Introduction
    Introduction
    There are several areas where traditional high-level languages are notably less efficient than hand-coding. Two of these areas are accessing array elements and calling procedures. Although in the former case much research has been done to improve the efficiency (resulting in the ALCOR and FORTRAN H2 compilers), little work has been published on calling procedures. In many scientific computations procedures are not called very frequently but this is not true in non-numeric work. Since modularisation by means of procedures is one of the most powerful programming tools in structuring complex problems, an inefficient calling mechanism can be very detrimental. In recent years there has been an increased use of high-level languages as assembler replacements, the so-called 'System Implementation Languages'. Such languages must achieve a high degree of object code efficiency if their use is to become more widespread. The purpose of this paper is to compare the procedure calling mechanism of traditional high-level languages with both system implementation languages and machine-code.
    Extract: Ackermann's function
    Ackermann's function
    In view of the paper by Sundblad [1], a reasonable test case for study seemed to be the evaluation of Ackermann's function. This function has the advantage of being very short and not requiring large integer values for typical cases, yet being complex enough to be a significant test. The figures given in Sundblad's paper also provided a basis for comparison before many values had been obtained.
    Following Sundblad, we calculate Ackermann (3,n) for increasing n, in accordance with the complete program listing below. Three figures are determined from the test, the average execution time per call, the number of instructions executed per call and the amount of stack space required for each call. Although the average time per call can be determined from running the program, the other two figures cannot be found so easily and indeed, in many cases the figures are not available.

    [1] Y. Sundblad, The Ackermann function. A theoretical, computational and formula manipulative study. BIT 11 (1971), 107-119.


    Extract: Acknowledgements
    Acknowledgements
    This work has been inspired by the International Federation for Information Processing Working Group 2.4. The desire of the group to obtain information on the performance of system implementation languages has led to tests such as this. It would not have been possible to write this paper without the active help of the following persons in running the test: Mr. L. Ammeraal (Mini-ALGOL 68), Dr R. Backhouse (B5500), Dr J. G. P. Barnes (RTL/2), Dr D. A. Bell (PL516), Dr H. Boom (Cyber 73 ALGOL 68), Mr P. Klint (C-UNIX), Mr R. Conradi (MARY, CYBER 74, 1108, CDC3300), Mr W. Findlay (PASCAL 1906A & machine code), Mr W. B. Foulkes (PASCAL 370/158), Professor G. Goos (B6700), Mr V. Hathaway (BCPL MOD 1), Mr M. Healey (PL/I OPT), Professor J. J. Horning (Sue 11), Mr B. Jones (ALGOL 60, Delft 370/165), Dr M. McKeag (PASCAL & ALGOL 60, 1906S), Mr Z. Mocsi (R10), Dr J. Palme (DEC10 SIMULA & machine code), Mr M.J. Parsons (PALGOL), Dr M. Richards (BCPL), Professor J. S. Rohl (Manchester 1900 ALGOL), Mr P. D. Stephens (IMP), Dr P. Wetherall (ALGOL 68-R), Professor N. Wirth (PASCAL), Professor D. B. Wortman (370/165 machine code) and Professor W. A. Wulf (Bliss 10, Bliss 11 & machine code).
    Extract: Results of tests (altered to include authors where known)

    Language | Computer | Time per call (ms) | Instructions per call | Words per call | Characteristic (see below) | Written by (if stated)
    ALGOL 60 | B6700 | 41.2 | 16 | 13 | ND2VO | Professor G. Goos
    ALGOL 60 | B5500 Mk XV.1.01 | 135 | 19.5 | 7 | NL2VO | Dr R. Backhouse
    ALGOL 60 | EMAS 4/75 | 46.7 | 21 | 18 | ND2VO |
    ALGOL 60 | 1906A Manchester | 29.2 | 33.5 | 30 | ND2VR | Professor J. S. Rohl
    ALGOL 60 | KDF9 Mk2 | 532 | 68.5 | 10 | CD2VR |
    ALGOL 60 | 1906S XALV | 70.9 | (120) | 13 | CD2TR | Dr M. McKeag
    ALGOL 60 | 370/165 Delft | 43.8 | (142) | ? | CD2TR | Mr B. Jones
    ALGOL 60 | NU 1108 | 175 | (175) | 9 | CD2TR | Mr R. Conradi
    ALGOL W | 360/67 Mk2 | 121 | (74) | (16-45) | HD2TR |
    IMP | ICL 4/75 | 46 | 17.5 | 18 | ND2VO | Mr P. D. Stephens
    SIMULA | 1108 | 120 | (120) | 9 | HD2TR | Mr R. Conradi
    SIMULA | DEC10 (KI10) | 317 | (158) | ? | HD2TR | Dr J. Palme
    SIMULA | CYBER 74 | 380 | (800) | (15) | HD2TR | Mr R. Conradi
    ALGOL 68 | 1907F (no heap) | 134 | 28 | 15 | NO2VO | Mr L. Ammeraal
    ALGOL 68 | 1907F (heap) | 167 | 34 | 15 | HD2VR | Mr L. Ammeraal
    ALGOL 68 | CDC 6400 (subset) | 45.3 | 51 | ? | HD2VO | Dr P. Wetherall
    ALGOL 68 | Cyber 73 v1.0.8 | 67.8 | (60) | 7 | HD2VR | Dr H. Boom
    Bliss 10 | DEC10 (KA10) | 53.15 | 15 | 5 | NLWVR | Professor W. A. Wulf / Mr Z. Mocsi
    PL/I | 360/65 OPT v1.2.2 | 101 | (61) | 68 | HD2AO | Mr M. Healey
    PL/I | 360/65 F v5.4 | 351 | (212) | (70) | HD2AR |
    PASCAL | 1906S | 19.1 | 32.5 | 11 | HDWVR | Professor N. Wirth / Mr W. Findlay / Dr M. McKeag
    PASCAL | CDC 6400 | 34 | 38.5 | 6 | HDWVO | Professor N. Wirth
    PASCAL | 370/158 | 39 | 42.5 | 30 | HDWVE | Professor N. Wirth / Mr W. B. Foulkes
    RTL/2 | 4/70 | 46 | 14.5 | 14 | NLWVO | Dr J. G. P. Barnes
    RTL/2 | PDP11/20 | (107) | 30.5 | ? | CLWVH | Dr J. G. P. Barnes
    PALGOL | PDP11/20 | (46) | 13 | 3 | NLWVO | Mr M. J. Parsons
    Bliss/11 | PDP11/20 OPT | 31 | 8 | 2 | NLWVO | Professor W. A. Wulf / Professor J. J. Horning
    MARY | SM4 | 105 | 30.5 | 9 | COWVR | Mr R. Conradi
    CORAL 66 | MOD 1 | (21) | 15.5 | 3 | NLWVO |
    PL516 | DDP516 | 84.5 | 37 | 2 | CLWVH | Dr D. A. Bell
    C (UNIX) | PDP 11/45 | 50.4 | 26 | ? | NLWVR | Mr P. Klint
    BCPL | 370/165 | 5.9 | 19 | 9 | NLWVR | Dr M. Richards
    BCPL | PDP 11/45 | 48 | 20.5 | 7 | NLWVO | Dr M. Richards
    BCPL | MOD 1 | (35) | 25 | 7 | NLWVR | Dr M. Richards / Mr V. Hathaway
    Machine code | Most machines | ? | 5-14 | 1-2 | NLWVO | Dr J. Palme / Professor D. B. Wortman / Professor W. A. Wulf
    Extract: Program listing
    Program listing
    begin
      integer i, j, k, k1; real t1, t2;
      integer procedure Ackermann(m, n); value m, n; integer m, n;
        Ackermann := if m = 0 then n + 1
          else if n = 0 then Ackermann(m - 1, 1)
          else Ackermann(m - 1, Ackermann(m, n - 1));
      k := 16; k1 := 1;
      for i := 1 step 1 until 6 do
      begin
        t1 := cputime;
        j := Ackermann(3, i);
        t2 := cputime;
        if j ≠ k - 3 then outtext(1, 'WRONG VALUE');
        outreal(1, t2 - t1);
        comment Now output time per call;
        outreal(1, 3 × (t2 - t1)/(512 × k1 - 15 × k + 9 × i + 37));
        k1 := 4 × k1; k := 2 × k
      end
    end
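    For comparison, the same benchmark might be rendered in Pascal roughly as follows. This is a sketch only: as noted later under "Notes on the results", Pascal has no conditional expressions, so the function body becomes nested if statements, and the timing function clock is an assumed, installation-supplied routine.

    program AckermannTest(output);
    var
      i, j, k, k1: integer;
      t1, t2: real;

    function Ackermann(m, n: integer): integer;
    begin
      if m = 0 then
        Ackermann := n + 1
      else if n = 0 then
        Ackermann := Ackermann(m - 1, 1)
      else
        Ackermann := Ackermann(m - 1, Ackermann(m, n - 1))
    end;

    begin
      k := 16; k1 := 1;
      for i := 1 to 6 do
      begin
        t1 := clock;                 { assumed: returns elapsed processor time }
        j := Ackermann(3, i);
        t2 := clock;
        if j <> k - 3 then writeln('WRONG VALUE');
        writeln(t2 - t1);                                            { total time }
        writeln(3 * (t2 - t1) / (512 * k1 - 15 * k + 9 * i + 37));   { time per call }
        k1 := 4 * k1; k := 2 * k
      end
    end.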
    Extract: Properties of the algorithm
    Properties of the algorithm
    To determine the execution characteristics of the test one needs the following properties of the algorithm (which can be easily proved by induction). The number of executions of each leg of the algorithm in calculating Ackermann (3, n) is:
    first leg:           (64×4^n - 72×2^n + 6×n + 26)/3
    second leg:          (24×2^n - 3×n - 12)/3
    third leg:           (64×4^n - 72×2^n + 6×n + 23)/3
    total no. of calls = (128×4^n - 120×2^n + 9×n + 37)/3
    Hence for large n, the average number of instructions per call is half the number of instructions in the first and third legs. Both legs contain the procedure entry and exit overheads and the third leg includes two unsuccessful tests for equality with zero. So the number of instructions executed per call can be found by examining the machine code produced by the compiler. Unfortunately this is not always easy and sometimes almost impossible. To provide an approximate comparative value, an estimate has been made from the time using a Gibson mix value [2] to extrapolate from a similar machine. In a few cases, the instructions executed are known but the time has not been measured. In these cases, an estimate for the time is given based upon published instruction times.
    The amount of stack space required by the program is determined by the maximum depth of procedure calls. This is given by Sundblad as
    Ackermann(3, n) - 1 = 2^(n + 3) - 4
    and occurs when the third leg is being evaluated at all but the last of those levels. Hence the amount of stack space required doubles when n is increased by one. For a more detailed discussion on the storage requirements and the definition of the 'maximum depth of recursion' see [5]. To measure this space, Sundblad observed the maximum value of n for which Ackermann (3, n) could be calculated using 26K words of store (which he called the 'capability'). Although this measurement is easy to take, it only provides a rough estimate of the space needed per call. Examination of the machine code of the third leg of the algorithm gives the precise value. In fact, the number of words per call is the difference between the address of m in the inner call and the address of the parameter in the routine. It is not easy to calculate this value from the code, and so a post-mortem print may be the easiest way to find the value. Rather than give the capability, the number of words per call is listed with the results. If only the capability is known, bounds can be given for the number of words per call by assuming that between 3K and 10K words are needed out of the 26K store for the program and run-time system.
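    These counts are easy to confirm empirically. The following instrumented version (a hypothetical harness in Pascal, not part of the paper) counts the executions of each leg and can be checked against the formulas above:

    program CheckCounts(output);
    var
      n, r, calls: integer;
      first, second, third: integer;   { executions of each leg }

    function Ack(m, p: integer): integer;
    begin
      if m = 0 then
        begin first := first + 1; Ack := p + 1 end
      else if p = 0 then
        begin second := second + 1; Ack := Ack(m - 1, 1) end
      else
        begin third := third + 1; Ack := Ack(m - 1, Ack(m, p - 1)) end
    end;

    begin
      for n := 1 to 5 do
      begin
        first := 0; second := 0; third := 0;
        r := Ack(3, n);
        calls := first + second + third;
        { expected: r = 2^(n+3) - 3 and calls = (128×4^n - 120×2^n + 9×n + 37)/3 }
        writeln(n : 3, r : 8, calls : 10)
      end
    end.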
    Extract: Notes on the results and Factors influencing the execution speed
    Notes on the results
    Estimates of missing values are included in the table in brackets and have been calculated in the manner described above. The program listing in all cases has followed the coding given very closely. The only exceptions are 1) the machine code examples, 2) the PASCAL and SUE systems which have no conditional expressions, and 3) PL516 which follows the hardware of the machine in not supporting recursion (stacking is performed by subroutines).

    Factors influencing the execution speed
    Factors influencing the call of a recursive procedure vary from inherent problems with the architecture of the machine to purely software problems on the design of the procedure calling mechanism. Against each of the results above is a sequence of letters and digits which describes the principal characteristics governing the execution performance.
    Recursion. On most modern machines with a few base registers and indexing facilities, the basic overhead of recursion is very small indeed. A few minicomputers do not have adequate base registers or addressing facilities to support recursion. The Honeywell DDP516 is of this type, hence with the high-level assembler PL516 stacking must be performed explicitly. On some machines, although the addressing logic is available an additional time penalty occurs by addressing via pointers. On the Modular 1 implementation of CORAL 66, recursion is an option. In fact the code produced with recursion is more efficient than that without recursion. The reason for this is that the short address length of the Modular 1 means that base registers must be loaded to access the variables of another procedure. This is a larger overhead than that incurred using a single stack register which only needs to be incremented and decremented by a small amount. Without recursion, the number of instructions is increased to 19.5 (30% more).
    Storage allocation. In order to execute this program, a stack storage scheme is necessary. It is sometimes possible on conventional hardware to use an area in the store as stack without performing an explicit software check for stack overflow. One can then rely upon a hardware address violation which should permit the output of a suitable diagnostic. Such systems are marked by an N, whereas C denotes a software stack overflow check. Languages such as ALGOL 68, PL/I and SIMULA require more than a simple stack and hence must perform an overflow check in a manner which allows a recovery to perform some garbage collection. Systems like this are marked with an H. PASCAL permits additional storage apart from the stack but without garbage collection. Although no garbage collection is involved, PASCAL is marked with an H. Only SIMULA requires an actual garbage collection during the execution of this test. The method used to administer the stack is partly a matter of language design and partly a matter of implementation. For instance, although ALGOL 68 requires a heap, the ALGOL 68-R implementation will run a stack only if the program does not require the heap. The difference, shown above, is that an additional subroutine is called on procedure entry to check that adequate stack space is available.
    Complete display. Block structured languages in general require a base pointer for every block whose variables can be accessed at the relevant part of the program. This facility is marked with a D. Languages such as BCPL and Burroughs B5500 ALGOL restrict access to local and global identifiers only, permitting a significant economy in pointers. These languages are marked with an L. The actual method of implementing the display can vary independently of the language. For instance, ALGOL W and IMP have every base pointer in a register, the PASCAL implementations above backchain the display (from the local procedure) and ALGOL 68-R keeps a copy in core of the full display in the local block. In almost all cases, there will be small variations in the calling overhead depending upon the relevant positions (in the display) of the call and the procedure.

    Dynamic stack storage. In ALGOL 60, the sizes of arrays are not, in general, known until program execution. This facility, implemented with second order working store, requires a small overhead on the administration of stack storage even when no arrays are being used, as in this test. Languages requiring such storage are marked with a 2 whereas systems allowing only storage whose size can be determined at compile time are marked with a W. PALGOL and RTL/2 are LW languages and hence require only one stack pointer (and one pointer for static storage depending upon the addressing structure).
    Parameter handling. The most expensive mechanism is marked with a T which denotes a 'thunk' that is required to implement the ALGOL 60 call by name [4]. A thunk can be avoided with this test by performing all parameter checking at compile time and using only the value mechanism (V). The KDF9, B5500 and Atlas ALGOL compilers all use the value mechanism. The use of the thunk mechanism by the other ALGOL 60 compilers is caused by the problem of handling the call of formal procedures whose parameters cannot be specified [3]. With value parameters, the ICL 1900 ALGOL compiler uses a simplified thunk mechanism but the Manchester ALGOL compiler uses the value mechanism. The PL/I systems use call by address which is marked with an A. With recursion, the addresses of parameters are dynamic and in consequence this method is less efficient than call by value.
    Open code or subroutines. Systems which handle this test entirely by open code are marked with an O, whereas the use of subroutines is marked with an R. The PL/I optimising compiler generates a subroutine call, but it is not invoked unless insufficient stack space is given (which did not happen in this case).
    Extract: Conclusion
    Conclusion.
    Ackermann's function provides a simple method of comparing the efficiency of the procedure calling mechanism of a language permitting recursion. The results show a very wide variation in performance even for languages containing no inherent complications. Additional instructions required in ALGOL 68, PL/I and PASCAL to check for stack overflow are quite insignificant compared to the hundreds of extra instructions executed by the inefficient implementations of ALGOL 60. There is no doubt that 'System Implementation Languages' give very much better results on this test without reducing the facilities to the programmer. Machine independence seems to be realised in this case without any measurable cost as BCPL shows.
    Does Ackermann's function represent a good test for a system implementation language? Unfortunately no statistical information is available to the author on the use of procedures in operating systems and compilers etc. Hence it is not known if, for instance, two parameters is typical. The large amount of stack space used is certainly not typical and can result in pathological situations. For instance, stack space is claimed in 64 word units under George 3 on a 1906A, but is not released. Hence during the first part of the algorithm when the stack is being increased a large operating system overhead occurs. During the second part when the stack is rarely increased beyond its previous maximum, there is no significant overhead. The computational part of testing for the equality with zero, jumping and adding or subtracting one seems very typical of non-numeric work. On the 360 computer, the fact that the algorithm is very short (<4K bytes) results in a small saving, but on the IMP compiler which can take advantage of this the speed was only increased by 3%.

    From the better figures produced by system implementation languages, the code for Ackermann is roughly divided as follows:
                                      instructions
    subroutine entry and exit              2
    stacking return address                1
    setting up environment                 3
    checking for stack overflow            2   (if check made)
    restoring old environment              3   (on procedure exit)
    setting parameters                 2 × 1 = 2
    body of Ackermann                      8
    Total                                 21

  • Edwards, R. "Is PASCAL a logical subset of ALGOL 68 or not? I." view details
          in Proceedings of the Strathclyde ALGOL 68 conference Glasgow, Scotland 1977 view details
  • Sammet, Jean E "Roster of programming languages for 1976-77" pp56-85 view details
          in SIGPLAN Notices 13(11) Nov 1978 view details
  • Tanenbaum, AS "A comparison of PASCAL and ALGOL 68" view details
          in The Computer Journal 21(4) view details
  • Steensgaard-Madsen, Jørgen "Pascal - Clarifications and Recommended Extensions" pp73-94 view details
          in Acta Informatica 12(1) June 25, 1979 view details
  • Wasserman, A. I. "Testing and Verification Aspects of Pascal-Like Languages" view details
          in Computer Languages 4(3-4) view details
  • Addyman, A. M. (1980). "A draft proposal for PASCAL." view details
          in SIGPLAN Notices 15(04) April 1980 view details
  • Addyman, A. M. (1980). "PASCAL standardisation." view details
          in SIGPLAN Notices 15(04) April 1980 view details
  • Leeand G. "DOPLs: a new style of programming" pp176-182 view details
          in The Computer Journal 25(2) May 1982 view details
  • Wichmann, BA "A comparison of Pascal and Ada" pp248-252 view details
          in The Computer Journal 25(2) May 1982 view details
  • Dencker, Peter; Dürre, Karl; Heuft, Johannes "Optimization of parser tables for portable compilers" pp546-572 view details
          in TOPLAS 6(4) October 1984 view details
  • Lima, I.G. "Programming Decentralised Computers" Department of Computing Science, University of Newcastle upon Tyne, 1984 view details External link: Online copy
  • Wirth, Niklaus "Pascal and its Successors" view details Abstract: The programming language Pascal was designed in 1969 in the spirit of Algol 60 with a concisely defined syntax representing the paradigm of structured programming. Seven years later, with the advent of the micro computer, it became widely known and was adopted in many schools and universities. In 1979 it was followed by Modula-2 which catered for the needs of modular programming in teams. This was achieved by the module construct and the separate compilation facility. In an effort to reduce language complexity and to accommodate object-oriented programming, Oberon was designed in 1988. Here we present some aspects of the evolution of this family of programming languages.

    External link: Online copy Extract: Pascal, 1968-1972
    Pascal, 1968-1972
    Freed from the constraining influence of a working group's consensus, Wirth developed the language Pascal in Zurich. The basis was Algol-W and the desire to have a language that would satisfy the requirements of system design (compilers, operating systems, etc.). Also, there was to be a basis of clear concepts and structures, definable axiomatically and independently of any particular computer, as the language was to be suitable also for teaching in an academic environment. Pascal has satisfied these requirements; it is today one of the most widely used languages in computer science education. The first Pascal compiler was designed in Zurich for the CDC 6000 computer family, and it became operational in 1970. Already in 1972 Pascal was used in introductory programming courses. Extract: 0. Introduction
    0. Introduction
    Many times I have been asked how one "invents" a programming language. One cannot really tell, but it certainly is a matter of experience with the subject of programming, and of careful deliberation. Sometimes I answered: "Like one designs an airplane. One must identify a number of necessary building blocks and materials, and then assemble and combine them properly to a functioning whole". This answer may not be entirely satisfactory, but at least in both cases the result either flies or crashes.

    Programming languages were one of the first topics that established computing science as a discipline with its own identity. The topic belonged neither to mathematics nor to electrical engineering. It was Algol 60 that introduced rigor and precision to the subject through its formal definition of syntax. A flurry of activities began in academia to investigate language properties, to find faults and inconsistencies, to devise powerful algorithms of syntax analysis, and to cope with the challenges of compilers. Soon the range of application of Algol was felt to be too narrow. A new, better language was required, perhaps a successor to Algol. Committees were established and hot controversies raged, some protagonists dreaming of grandiose formal systems, some thinking more modestly of a practical improvement. It was this environment that bred Pascal.


    Extract: 1. Structured Programming and Pascal
    1. Structured Programming and Pascal
    Pascal was born in 1969 out of an act of liberation [0]. In more than one sense. Confronted with the duty to teach programming, I had been faced with the dire options of Fortran and Algol. The former did not appeal to my taste as a scientist, the latter not to those of the practical engineer. I liberated myself from this jail by designing Pascal, convinced that an elegant style and an effective implementation were not mutually exclusive. I felt strongly -- and still do -- that a language used in teaching must display some style, elegance, consistency, while at the same time also reflecting the needs (but not necessarily bad habits) of practice. I wanted to design a language for both my classroom and my "software factory".

    The second alluded liberation was from the design constraint imposed by committee work. In 1966, Algol W [1] had been a compromise bowing to many divergent opinions and requests from both an Algol committee and an Algol community. Surely, many of them were inspiring and beneficial, but others were incompatible and hindering. Some members had high ambitions of creating a language with novel features whose consequences were to be the subject of further research, whereas I had been brought up as an engineer feeling uneasy with proposals whose realization was still the subject of speculation. I wanted to have at least a concrete idea of how a construct was to be represented on available computers, and these, let me add, were rather ill-suited for any feature not already present in Fortran.

    The general idea dominating the design of Pascal was to provide a language appealing to systematic thinking, mirroring conventional mathematical notation, satisfying the needs of practical programming, and encouraging a structured approach. The rules governing the language should be intuitive and simple, and freely combinable. For example, if x+y stands for an expression, x+y should be usable as a sub expression, in assignments, as procedure parameter, or as index. For example, if a widely established convention interprets x-y-z to mean (x-y)-z, we should not redefine this to denote x-(y-z). Or if x=y is used for centuries to denote equality of x and y, we should refrain from the arrogance of replacing it by x==y. Clearly, Pascal was to build up on the notational grounds established by mathematics and Algol. Pascal and its successors were therefore called Algol-like.

    Today, it is hard to imagine the circumstances prevailing in the 1960s. We must recall that the computing community was strictly split into two professional camps. The scientists and engineers used Fortran for programming their large-scale, word-oriented, binary computers, whereas the business community used Cobol for their smaller, character-oriented, decimal machines. System programmers were labouring within computer companies using proprietary machine-code assemblers. There were attempts to unite the two worlds, such as the highly innovative Burroughs B-5000 computer, or IBM's programming language PL/I. Both were ill-fated and devoured considerable budgets. Pascal was another such attempt, although less ambitious and without budget or industrial support. It applied the idea of recursively defined structures not only to executable statements, but also to data types. As elements, it adopted arrays (vectors, matrices) from Fortran and Algol, as well as records and files from Cobol. It allowed them to be freely combined and nested.

    The other fact about the 1960s that is difficult to imagine today is the scarcity of computing resources. Computers with more than 8K of memory words and less than 10us for the execution of an instruction were called super-computers. No wonder it was mandatory for the compiler of a new language to generate at least equally dense and efficient code as its Fortran competitor. Every instruction counted, and, for example, generating sophisticated subroutine calls catering to hardly ever used recursion was considered an academic pastime. Index checking at run-time was judged to be a superfluous luxury. In this context, it was hard if not hopeless to compete against highly optimized Fortran compilers.

    Yet, computing power grew with each year, and with it the demands on software and on programmers. Repeated failures and blunders of industrial products revealed the inherent difficulties of intellectually mastering the ever increasing complexity of the new artefacts. The only solution lay in structuring programs, to let the programmer ignore the internal details of the pieces when assembling them into a larger whole. This school of thought was called Structured Programming [2], and Pascal was designed explicitly to support this discipline. Its foundations reached far deeper than simply "programming without go to statements" as some people believed. It is more closely related to the top-down approach to problem solving.

    Besides structured statements, the concept of data types characterized Pascal profoundly. It implies that every object, be it a constant, a variable, a function, or a parameter has a type. Data typing introduces redundancy, and this redundancy can be used to detect inconsistencies, that is, errors. If the type of all objects can be determined by merely reading the program text, that is, without executing the program, then the type is called static, and checking can be performed by the compiler. Surely errors detected by the compiler are harmless and cheap compared to those detected during program execution in the field, by the customer. Thus static typing became an important concept in software engineering, the discipline emerging in the 1970s coping with the construction of large software complexes.

    A particularly successful concept was the integration of pointers into static typing as suggested by Hoare [3] and adopted in Pascal. The simple idea is to attribute a fixed type not only with every data object, but also with every pointer, such that a pointer variable can at any time only refer to an object of the type to which it is bound (or to none at all). Programming with pointers, then called list processing, notoriously fraught with pitfalls, now became as safe as programming without pointers.
    Yet, Pascal also suffered from certain deficiencies, more or less significant depending on personal perception and application. One of them had its origin in a too dogmatic interpretation of static typing, requiring that the type of every procedure parameter be known at compile-time. Since this included index bounds in the case of array types, the frequently convenient dynamic arrays were excluded. In hindsight, this rigidity was silly and kept many Algolites from adopting Pascal. Arrays are typically passed by a reference, and for dynamic arrays only the array bounds must be added to this information. The limited additional complexity of the compiler would certainly have been outweighed by the gained language flexibility.
    Extract: 2. Modular Programming and Modula-2
    2. Modular Programming and Modula-2
    With various defects clearly identified and new challenges in programming emerging, time seemed ripe for a fresh start, for a successor language. The two foremost novel challenges were multiprogramming and information hiding. For me personally, a third, quite practical challenge became an ambition: To create a language adequate for describing entire systems, from storage allocator to document editor, from process manager to compiler, and from display driver to graphics editor. I perceived that many problems in software development stemmed from the mixing of parts written in different languages. The challenge became real within our project to design and build the workstation Lilith in 1978 [6]. Its precursor was Xerox PARC's pioneering workstation Alto [5]. The Alto's software was mostly written in Mesa; Lilith's software entirely in Modula-2. It would have been prohibitive to implement more than one language. Evidently, Modula was born out of an act of necessity [7].

    The cornerstone of Modula-2 was the module construct. Whereas Pascal had served to build monolithic programs, Modula-2 was suitable for systems consisting of a hierarchy of units with properly defined interfaces. Such a unit was called module, and later package in Ada. In short, a module is like a Pascal program with the addition of an explicit interface specification to other modules. This is obtained as follows: Modules are described by two, distinct texts, a definition part and an implementation part. In the former all objects are defined which are visible by other modules, typically types and procedure signatures. They are said to be exported. The latter part contains all local, hidden objects, and the bodies of procedures, i.e. their implementations. Hence the term information hiding. The heading contains lists of identifiers to be imported from other modules.

    [...]

    This key feature catered for the urgent demands for programming in teams. Now it became possible to determine jointly a modular decomposition of the task and to agree on the interfaces of the planned system. Thereafter, the team members could proceed independently in implementing the parts assigned to them. This style is called modular programming. The concept of module arose earlier in work by Parnas and, in conjunction with multi-programming by Hoare and Brinch Hansen, where the module construct was called a monitor [8, 9]. The module was also present in a concrete form in Mesa, which in Modula was simplified and generalized.

    The module construct would, however, have remained of mostly academic interest only, were it not for the technique of separate compilation, which was from its inception combined with the module. By separate compilation we understand that (1) full type checking is performed by the compiler not only within a module, but also across module interfaces, and (2) that compatibility (or version) checking between modules to be joined is achieved by a simple key comparison when the modules are linked and loaded. We refrain from expounding technical details, but emphasize that this is a crucial requirement for the design of complex systems, yet still poorly handled by most systems of commercial provenience.

    Besides the successful feature of the module with separate compilation, the language also had some drawbacks. Surely, the evident deficiencies of Pascal had been mended. The syntax was now unambiguous, type specifications complete, and the set of basic data types adequately comprehensive. But as a result, the language, and with it the compiler, had become relatively large and bulky, although still orders of magnitude less so than comparable commercial ventures. The goal of making the language powerful enough to describe entire systems was achieved by introducing certain low-level features, mostly for accessing particular machine resources (such as I/O device registers) and for breaching, overriding the rigid type system. Such facilities, like e.g. type casting, are inherently contrary to the notion of abstraction by high-level language, and should be avoided. They were called loopholes, because they allow to break the rules imposed by the abstraction. But sometimes these rules appear as too rigid, and use of a loophole becomes unavoidable. The dilemma was resolved through the module facility which would allow to confine the use of such "naughty" tricks to specific, low-level server modules. It turned out that this was a naive view of the nature of programmers. The lesson: If you introduce a feature that can be abused, then it will be abused, and frequently so! Extract: 3 Object-oriented Programming and Oberon
    3 Object-oriented Programming and Oberon
    The advent of the personal computer around 1980 changed the way in which computers were used dramatically. Direct, immediate interaction replaced remote access and batch processing. User interfaces became an important issue. They were shaped by the novel mouse and the high-resolution display, replacing the 24 lines by 80 character screens. They established the paradigm of windows and multi-tasking. It had been pioneered by the Xerox Alto workstation, and in particular the Smalltalk project [10]. Along with it came the object-oriented style of programming. Object-oriented design emerged from the specialized subject of discrete event simulation and its language Simula [11], whose authors Dahl and Nygaard had realised that its concepts had a scope far beyond simulation. Some of the proponents of object-oriented programming even suggested that all of programming should be converted to the new view of the world.

    We felt that a revolution was undesirable to cure the lamented ills of the software profession, and we considered evolution as the wiser approach. Tempted to design a version of Modula stripped down to essentials, we also wanted to identify those features that were indispensable to encompass object-orientation. Our findings were revealing: A single feature would suffice, all other ingredients were already present in Modula. The one feature to be added had to allow the construction of data type hierarchies, called sub-classing in Smalltalk. Our own term was type extension: The new type adds attributes to those of the old type. Type extension had the welcome side effect of practically eliminating all needs for loopholes.

    The absence of loopholes is the acid test for the quality of a language. After all, a language constitutes an abstraction, a formal system, determined by a set of consistent rules and axioms. Loopholes serve to break these rules and can be understood only in terms of another, underlying system, an implementation. The principal purpose of a language, however, is to shield the programmer from implementation details, and to let him think exclusively in terms of the higher-level abstraction. Hence, a language should be fully defined without reference to any implementation or computer.

    The language Oberon was born out of the ambition to simplify language to the essentials. The result turned out to be a considerably more powerful and more elegant language than its predecessors. The defining report of Pascal required 30 pages, that of Modula grew to 45 pages, Oberon's could do with 16 [12]. Not surprisingly, implementations profited substantially from this achievement after 25 years of experience in language design, from a continuous evolution.

    One of the simplifications was the reunification of the definition and implementation parts of modules. An Oberon module is again defined by a single text. Its heading contains a single list of server modules (rather than of individual, imported objects). Declarations of objects that are to be accessible in client modules are specially marked. Unmarked, local objects remain hidden. From a didactic point of view this reunification may be regretted, because ideally definition parts are designed among team members and form contracts between them, whereas implementation parts can thereafter be designed by individual members without regard for the others, as long as the definition part remains untouched. However, the proliferation of files and the burden to keep corresponding parts consistent was considered a drawback. Moreover, reunification eliminated the compiler's duty to check for consistency between definition and implementation parts. Also, a definition part can readily be extracted from the module text by a simple tool. Extract: 4 Conclusions and Outlook
    4 Conclusions and Outlook
    In his article about Oberon, M. Franz wrote in [13]: "The world will remember Niklaus Wirth primarily as 'that language guy' and associate his name with Pascal." His observation is accurate; also the invitation to speak at this Conference hinted that I should concentrate on Pascal. My disobedient digression stems from my conviction that its successors Modula and Oberon are much more mature and refined designs than Pascal. They form a family, and each descendant profited from experiences with its ancestors. At the end, the time span was 25 years.

    Why, then, did Pascal capture all the attention, and Modula and Oberon got so little? Again I quote Franz: "This was, of course, partially of Wirth's own making". He continues: "He refrained from ... names such as Pascal-2, Pascal+, Pascal 2000, but instead opted for Modula and Oberon". Again Franz is right. To my defense I can plead that Pascal-2 and Pascal+ had already been taken by others for their own extensions of Pascal, and that I felt that these names would have been misleading for languages that were, although similar, syntactically distinct from Pascal. I emphasized progress rather than continuity, evidently a poor marketing strategy.

    But of course the naming is by far not the whole story. For one thing, we were not sufficiently active -- today we would say aggressive -- in making our developments widely known. Instead of asking what went wrong with Modula and Oberon, however, let us rather ask what went right with Pascal. In my own perception, the following factors were decisive:

    1. Pascal, incorporating the concepts of structured programming, was sufficiently different and progressive from Fortran to make a switch worth while. Particularly so in America, where Algol had remained virtually unknown.

    2. In the early 1970s, an organization of users (Pascal User Group PUG) was formed and helped to make Pascal widely known and available. It also published a Newsletter.

    3. Pascal was ported just in time for the first micro computers (UCSD) [14], and thereby reached a large population of newcomers unconstrained by engrained habits and legacy code.

    4. Pascal was picked up by start-up companies. Borland's Pascal implementation was the first compiler to be available for less than $50, turning it into a household article.

    5. UCSD as well as Borland properly integrated the compiler into a complete development tool, including a program editor, a file system, and a debugging aid. This made Pascal highly attractive to schools and to beginners. It changed the manner in which programs were "written". A fast write-compile-test-correct cycle and interactivity were the new attractions.

    6. Shortly after the initial spread, an ever growing number of text books on programming in Pascal appeared. They were as important as the compiler for Pascal's popularity in schools and universities. Text books and software entered a symbiotic partnership.
    Perhaps it is worth observing that this chain reaction started around 1977, fully seven years after Pascal had been published and implemented on a CDC mainframe computer. Meanwhile, Pascal had been ported to numerous other large computers, but remained largely within universities. This porting effort was significantly facilitated by our project resulting in the Pascal P-compiler generating P-code, the predecessor of the later M-code (for Modula) and Java byte-code.
    In contrast to Pascal, Modula and Oberon did not appear at a time when computing reached new segments of the population. The module concept was not perceived in teaching as sufficiently significant to warrant a change to a new, albeit similar language. Text books had been selected, investments in learning had been made, time was not ripe for a change. Industry did not exactly embrace Modula either, with a few exceptions, mainly in Britain. A more palatable solution was to extend Pascal, retaining upward compatibility and old shortcomings. And there appeared competition in the form of C++ and Ada with powerful industrial backing.
    Oberon fared even worse. It was virtually ignored by industry. This is astounding, as not only the elegant and powerful language was presented in 1988, but also a compact and fast compiler in 1990, along with a modern, flexible development environment for workstations, complete with window system, network, document and graphics editor, neatly fitting into about 200 Kbytes of memory, the compiler alone taking less than 50 Kbytes. The entire system was fully described in a single, comprehensive book, including its source code [15]. It carried the proof that such a system could be built using only a small fraction of manpower typically allotted to such endeavors, if proper methods and tools were employed.
    One is tempted to rationalize this history with recent, deep changes in the computing field. Computing resources are no longer at a premium, memory is counted in megabytes rather than kilobytes as 15 years ago, instruction cycles in nanoseconds instead of microseconds, and clock frequencies in gigahertz instead of megahertz. The incentive to economize has dwindled alarmingly. The only scarce resource is manpower, competent manpower. Industry has difficulties even in finding good C-programmers, those of finding Oberon programmers are insurmountable. So how could one reasonably expect companies to adopt Oberon?
    We recognize a deadlock: The adoption of new tools and methods is impracticable, yet the retention of the old ones implies stagnation. Are we therefore condemned to eternally produce an ever growing mountain of software of ever growing complexity, software that nobody fully understands, although everybody is well aware of its defects and deficiencies? In order to avert this highly unpleasant vision of the future, some fresh starts will have to be undertaken now and then. They require the courage to discard and abandon, to select simplicity and transparency as design goals rather than complexity and obscure sophistication.
    In the field of programming languages two fresh starts have been attempted recently. I refer to Java and C#. Both tend to restrict rather than augment the number of features, trying to suggest a certain programming discipline. Hence they move in the right direction. This is encouraging. But there remains still a long way to reach Pascal, and longer even to Oberon.

          in Software Pioneers: Contributions to Software Engineering, Bonn, 28-29. 6. 2001 eds Broy, Manfred and Denert, Ernst Springer 2002 view details
  • Library of Congress Subject Headings P2 view details