REDUCE 2 (ID:7012/red009)

Improved version of REDUCE with new format; version 2 of REDUCE. Added features from other systems, and "Along with most current systems, REDUCE 2 has adopted a 'natural' form of expression output first developed by Jonathan Millen of Carl Engelman's MATHLAB project."

Related languages
References:

Introduction

During the last few years, considerable progress has been made in the solution of algebraic problems by computer. At the same time, system builders have been provided with the necessary impetus to produce better systems by studying the reactions of users to programs available during this period. The version of the REDUCE system which is described in this paper, and which I shall refer to as REDUCE 2, has been produced in response to such a challenge. By studying the use and misuse of the previous versions, it has been possible to supply many of the facilities which appear to be desirable in an algebraic simplification program.

REDUCE in one form or another has been around for over seven years now. Originally it began as a system for solving some particular problems which arise in high energy physics, where much tedious repetitive calculation is involved. However, it was quickly recognized that the simplification processes being used were quite general, and in 1967 REDUCE was announced as a system for general purpose algebraic simplification and released for distribution. Over 25 installations now have REDUCE operating, and over 30 publications acknowledging its use have appeared in the literature. Most of these successes have been in the field of high energy physics, and some of these were discussed in my earlier talk. However, there have been other applications in chemical and design engineering and in celestial mechanics.

In developing a completely new program of the scope of REDUCE 2, it is natural that a debt is owed to many other systems and people for their ideas. In particular, I am most grateful to the producers of FORMAC, MATHLAB, SCRATCHPAD, and SYMBAL for some of their good ideas which have been adopted in REDUCE 2. I am even more indebted to the many users of the earlier versions of REDUCE whose continual suggestions for improvement motivate much of my present work.

The latest version of REDUCE offers users the following facilities for the solution of their problems:

1) expansion and ordering of rational functions;
2) symbolic differentiation of rational functions;
3) substitutions and pattern matching in a wide variety of forms;
4) automatic and user-controlled simplification of expressions;
5) calculations with symbolic matrices;
6) procedural facilities for defining functions and extending the system;
7) calculations of interest to high energy physicists, including spin 1/2 and spin 1 algebra;
8) tensor operations.

In general terms, REDUCE is command oriented rather than program oriented. This is a consequence of the fact that REDUCE is intended primarily for interactive calculations in a time-shared environment, since the result of a given command may be required before proceeding on to the next step. The syntax of individual commands bears the closest resemblance to ALGOL 60 (8) of any of the well known algebraic languages. In fact, REDUCE could be regarded as an extension of a subset of ALGOL 60. A program as such consists of a sequence of commands, each of which is interpretively evaluated by the system before proceeding on to the next command.

REDUCE can also operate, of course, in a batch processing environment. There is a basic difference, however, between interactive and batch use of the system. In the former case, whenever the system discovers an ambiguity at some point in a calculation, such as a forgotten type assignment for instance, it asks the user at that point for the correct interpretation.
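To give a flavour of the interactive, command-at-a-time style, here is a minimal sketch touching a few of the facilities listed above. It is written in present-day REDUCE notation and is not taken from the paper; the exact REDUCE 2 forms may differ in detail.

    (x + y)**2;                      % expansion and ordering: x**2 + 2*x*y + y**2
    df((x + y)**2/z, x);             % symbolic differentiation of a rational function
    let cos(z)**2 = 1 - sin(z)**2;   % a user-supplied substitution rule
    x*(sin(z)**2 + cos(z)**2);       % simplifies to x under the rule above

    matrix m;                        % a symbolic matrix
    m := mat((a, b), (c, d));
    det(m);                          % a*d - b*c

    procedure fac n;                 % an ALGOL-style procedure definition
       if n = 0 then 1 else n*fac(n - 1);
    fac(5);                          % 120

Each statement is terminated by a semicolon and is evaluated, with its result printed, before the next one is read.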
In batch operation, it is not practical to terminate the calculation at such points and require resubmission of the job, so the system makes what it considers to be the most reasonable guess at the requirement and continues. To a certain extent, interactive use can be simulated in a batch system, as the contents of core can usually be saved at any point in the calculation. The calculation can then be continued from that point in a later job. Of course, this can be frustratingly slow if the job turnaround time is at all large.

In addition to being a system for algebraic manipulation, REDUCE is also a language for the statement of problems. The language is complete enough to enable the whole REDUCE program to be written in its own source language. However, a subset of LISP 1.5 is still used as an intermediate language, thus allowing for easy implementation of the same program on several different machines. REDUCE 2 is presently available for use on most IBM 360 or 370 Series computers, the DEC PDP-10, and the CDC 6400, 6500, 6600 and 7600 machines. Information on obtaining a copy of REDUCE for any of these machines is available from the author.

It is impossible in a paper this size to give a complete syntactical definition of the REDUCE language or an account of all the commands available. This information is contained in the user's Manual (10), which is available on request. This paper will therefore limit itself to a description of the most interesting features of the system, illustrated as often as possible by working examples.

Conclusion

REDUCE as it stands today is still a growing system. There remain some obvious deficiencies. For example, there are no facilities for symbolic factorization or integration. Of course the LET command can be used to perform integration by table lookup and pattern matching, and this has been successfully used in several calculations (19). However, we hope to implement an operator INT which will effect the integration of a wide class of functions. We are watching with great interest the work of Joel Moses and his collaborators and plan to implement some of their ideas in REDUCE. The problem of output expression swell, which has concerned me for several years, has still not been solved. We are currently using our new pattern matching algorithm in attempts to solve this problem, but it is obvious that a lot more work is needed.

There are also many facilities in REDUCE which have not been discussed in this paper, but which are described in full in the Manual. For example, because of their specialized nature, I have said nothing about the improved package of commands available for solving problems in high energy physics, although I did discuss this facility in my earlier talk. However, it is hoped that the reader has been given some idea of the great power of the system. It should also be stressed that this is no toy experimental system. The previous version has been in use for over three years now, and most of the improvements in the current system are in response to user suggestions. A debugging service is offered to users, and we are collecting a small library of application programs. Persons interested in learning more about this system should contact the author.
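As an illustration of the table-lookup integration via LET mentioned in the conclusion above, the following minimal sketch (again in present-day REDUCE notation) defines a hypothetical operator myint with a tiny rule table; the name and the rules are invented for this illustration and are not the INT operator or the tables used in the calculations cited.

    operator myint;   % hypothetical integration operator for this sketch

    % anything free of x integrates to itself times x
    for all a, x such that freeof(a, x) let myint(a, x) = a*x;

    % the power rule, excluding the logarithmic case n = -1
    for all x, n such that numberp(n) and n neq -1
       let myint(x**n, x) = x**(n + 1)/(n + 1);

    % one table entry for a transcendental form
    for all x let myint(cos(x), x) = sin(x);

    myint(x**3, x);    % x**4/4
    myint(cos(y), y);  % sin(y)
    myint(7, z);       % 7*z

Pattern matching selects whichever rule applies; further table entries extend the coverage without changing any other part of the program.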
in [ACM] Proceedings of the Second Symposium on Symbolic and Algebraic Manipulation, March 23-25, 1971, Los Angeles (SYMSAM 71)

Introduction

Some of the contributions from the 72 diagrams contributing to the α³ part of the anomalous magnetic moment of the electron [a_e = (g_e - 2)/2] have already been calculated. Recently the measurement of a_e has been performed by Wesley and Rich (5) with an accuracy of 6 ppm, so that a knowledge of the remaining contributions is urgently needed. One of the main features of this kind of calculation is the amount of algebra involved. A great part of it can only be done reasonably by computer. In fact, in the above-mentioned calculations, computer techniques were broadly used. Two programs devoted to symbolic manipulation were mainly used: the REDUCE system of Hearn, and SCHOONSCHIP, a machine code program developed by Veltman (7).

In the work we are reporting, a slightly different method is used, and all the manipulations involved in the calculations are performed by a computer. The program, written by one of us (J.C.), is intended to be general enough to calculate the contributions of all of the relevant diagrams. To do this, we had to develop a suitable method of renormalization and the appropriate calculational techniques. It appeared that the functional formalism and the well-known Feynman method were suitable for our purpose. The formulation of the renormalization theory in the framework of the functional formalism permits us to determine unambiguously the counterterms which have to be subtracted from a diagram to make its contribution finite. This, and the fact that the tensorial dependence of the skeleton divergence of a diagram contributing to a_e is in γ_μ, have been used to develop a numerical method of cancellation of the ultraviolet divergences.

The main features of this formulation are given in Sec. II. Section III is devoted to a survey of the program. Without entering into technical details, we stress its possibilities and limitations. This program is written in the LISP programming language. It takes as input a set of expressions describing a diagram and gives as output the contributions to a_e in terms of multidimensional integrals over the Feynman parameters. It is also possible to compute either the anomalous magnetic moment of the muon (a_μ) or the difference a_μ - a_e of the electron and muon magnetic moments. The integrals in hand are numerically estimated by means of a subroutine due to Scheppey, Dufner, and Lautrup. Section IV contains a brief description of this subroutine and describes how the infrared divergences are removed, again using a numerical method. The last section is devoted to the latest results obtained. We have chosen to compute first the diagrams with vacuum polarization insertions. While we were compiling our results, a paper by Brodsky and Kinoshita appeared, reporting on the same calculations. Both results are in agreement.

in [ACM] Proceedings of the Second Symposium on Symbolic and Algebraic Manipulation, March 23-25, 1971, Los Angeles (SYMSAM 71)

REDUCE 2

Starting from a canonical form suited to multivariate polynomials, REDUCE 2 extends itself by a variety of means to the manipulation of quite general mathematical expressions, relying on the polynomial procedures for basic simplification.
It is quite different from seminumerical programs in its support of a (rather constrained) user-defined pattern replacement facility and of a 1½-dimensional (exponents raised, but no nice two-dimensional (2-D) fractions like those in Fig. 2) mathematical display program. A REDUCE program for computing the F and G series mentioned at the beginning of this article would look like Fig. 5. The REDUCE 2 system supplies, in addition to its general-purpose routines, a significant facility specialized to the multilinear algebra associated with high-energy physics. This part of the system has proved extremely successful, with numerous published physics papers citing it as the computational vehicle.

Extract: FORMAC

The best known, purely symbolic systems are, of course, Formac and its current version PL/I-FORMAC (Petrick, 1971; pp. 105-114). Formac was the first widely available general-purpose algebraic manipulation system and served for a period to define the field. Certainly, there was a time when one could have safely made the statement that the majority of all mechanical symbolic mathematical computations had been done within Formac. The practical success of these systems, in spite of their rigidity with respect to user modifications and their lack of any seminumerical facilities for rational function computations, is probably due to the overall intelligence of the facilities that were provided. Above all, they were certainly sufficient to support the dominant application area of truncated power series expansion. Current support is minimal.

Extract: Symbolic systems

SYMBOLIC SYSTEMS. We should mention first a sequence of three early programs for the simplification of general symbolic mathematical expressions represented as prefix-notation tree structures. The first, at M.I.T., was due to Hart, and the other two were due to Wooldridge and Korsvold at Stanford. The latter has survived in current usage as a result of its incorporation, subject to modification, into the MATHLAB, MACSYMA, and SCRATCHPAD systems.

In the mid-1960s there appeared two systems, Formula Algol and FAMOUS, which, while dedicated to the symbolic manipulation of mathematical expressions, presented the user with almost no built-in automatic simplification facilities. This was due, at least in the case of FAMOUS, to a conscious decision that, since the "simplicity" of an expression is surely context-dependent, it should be reasonable to present the user with complete control over the simplification process. That is, the user should be compelled to define all transformations, rather than, as with most systems, be permitted simply to switch on and off the transformations supplied by the system architects. No system of this species has ever solved the inherent efficiency problems to the extent that it could serve more than didactic purposes. Probably neither Formula Algol nor FAMOUS could be revived today.

Another lost symbolic system of importance is the Symbolic Mathematical Laboratory of W. A. Martin. This system provided high-quality 2-D graphics on a DEC-340 display and was also the first to employ a light pen for subexpression selection. In some ways, it represented a degree of interaction that has not been duplicated by any subsequent system. Nor were its innovative internal programming techniques restricted to its graphics facilities. Of particular interest is the use of hash coding for subexpression matching (Petrick, 1971; pp. 305-310).

in Encyclopedia of Computer Science, Ralston, Anthony, and Meek, Chester L.
(eds). New York, NY: Petrocelli/Charter, 1976.

in Proceedings of the Third ACM Symposium on Symbolic and Algebraic Computation, August 10-12, 1976, Yorktown Heights, New York, United States

A BRIEF HISTORICAL SKETCH
-------------------------

The development of systems for symbolic mathematical computation first became an active area of research and implementation during the decade 1961-1971.

. . .

To put the decade 1961-1971 into perspective, let us recall that FORTRAN appeared about 1958 and ALGOL in 1960. These two languages were designed primarily for numerical mathematical computation. Then in 1960/1961 came the development of LISP, a language for list processing. LISP was a major advancement on the road to languages for symbolic computation. An operation such as symbolic differentiation which is foreign to FORTRAN and ALGOL is relatively easy in LISP. (Indeed this is one of the standard programming assignments for students first learning LISP.) As will be noted later, several computer algebra systems were written in LISP.

1961-1966
---------

In 1961, James Slagle at M.I.T. wrote a LISP program called SAINT for Symbolic Automatic INTegration. This was one of the earliest applications of LISP to symbolic computation and it was the first comprehensive attempt to program a computer to behave like a freshman calculus student. The program was based on a number of heuristics for indefinite integration and it performed about as well as a good calculus student.

One of the first systems for symbolic computation was FORMAC, developed by Jean Sammet, Robert Tobey, and others at IBM during the period 1962-1964. It was a FORTRAN preprocessor (a PL/I version appeared later) and it was designed for the manipulation of elementary functions including, of course, polynomials and rational functions.

Another early system was ALPAK, a collection of FORTRAN-callable subroutines written in assembly language for the manipulation of polynomials and rational functions. It was designed by William S. Brown and others at Bell Laboratories and was generally available about 1964. A language now referred to as Early ALTRAN was designed at Bell Laboratories during the period 1964-1966. It used ALPAK as its package of computational procedures.

There were two other significant systems for symbolic computation developed during this period. George Collins at IBM and the University of Wisconsin (Madison) developed PM, a system for polynomial manipulation, an early version of which was operational in 1961 with improvements added to the system through 1966. The year 1965 marked the first appearance of MATHLAB, a LISP-based system for the manipulation of polynomials and rational functions, developed by Carl Engelman at M.I.T. It was the first interactive system designed to be used as a symbolic calculator. Included among its many firsts was the use of two-dimensional output to represent its mathematical output.

The work of this period culminated in the first ACM Symposium on Symbolic and Algebraic Manipulation held in March 1966 in Washington, D.C. That conference was summarized in the August 1966 issue of the Communications of the ACM.

1966-1971
---------

In 1966/1967, Joel Moses at M.I.T. wrote a LISP program called SIN (for Symbolic Integrator). Unlike the earlier SAINT program, SIN was algorithmic in approach and it was also much more efficient. In 1968, Tony Hearn at Stanford University developed REDUCE, an interactive LISP-based system for physics calculations.
One of its principal design goals was portability over a wide range of platforms, and as such only a limited subset of LISP was actually used. The year 1968 also marked the appearance of Engelman's MATHLAB-68, an improved version of the earlier MATHLAB interactive system, and of the system known as Symbolic Mathematical Laboratory developed by William Martin at M.I.T. in 1967. The latter was a linking of several computers to do symbolic manipulation and to give good graphically formatted output on a CRT terminal.

The latter part of the decade saw the development of several important general purpose systems for symbolic computation. ALTRAN evolved from the earlier ALPAK and Early ALTRAN as a language and system for the efficient manipulation of polynomials and rational functions. George Collins developed SAC-1 (for Symbolic and Algebraic Calculations) as the successor of PM for the manipulation of polynomials and rational functions. CAMAL (CAMbridge Algebra system) was developed by David Barton, Steve Bourne, and John Fitch at the University of Cambridge. It was implemented in the BCPL language, and was particularly geared to computations in celestial mechanics and general relativity.

REDUCE was redesigned by 1970 into REDUCE 2, a general purpose system with special facilities for use in high-energy physics calculations. It was written in an ALGOL-like dialect called RLISP, avoiding the cumbersome parenthesized notation of LISP, while at the same time retaining its original design goal of being easily portable. SCRATCHPAD was developed by J. Griesmer and Richard Jenks at IBM Research as an interactive LISP-based system which incorporated significant portions of a number of previous systems and programs into its library, such as MATHLAB-68, REDUCE 2, Symbolic Mathematical Laboratory, and SIN.

Finally, the MACSYMA system first appeared about 1971. Designed by Joel Moses, William Martin, and others at M.I.T., MACSYMA was the most ambitious system of the decade. Besides the standard capabilities for algebraic manipulation, it included facilities to aid in such computations as limit calculations, symbolic integration, and the solution of equations.

The decade from 1961 to 1971 concluded with the Second Symposium on Symbolic and Algebraic Manipulation held in March 1971 in Los Angeles. The proceedings of that conference constitute a remarkably comprehensive account of the state of the art of symbolic mathematical computation in 1971.

1971-1981
---------

While all of the languages and systems of the sixties and seventies began as experiments, some of them were eventually put into "production use" by scientists, engineers, and applied mathematicians outside of the original group of developers. REDUCE, because of its early emphasis on portability, became one of the most widely available systems of this decade. As a result it was instrumental in bringing computer algebra to the attention of many new users. MACSYMA continued its strong development, especially with regard to algorithm development. Indeed, many of the standard techniques (e.g. integration of elementary functions, Hensel lifting, sparse modular algorithms) in use today either came from, or were strongly influenced by, the research group at M.I.T. It was by far the most powerful of the existing computer algebra systems. SAC/ALDES by G. Collins and R. Loos was the follow-up to Collins' SAC-1.
It was a non-interactive system consisting of modules written in the ALDES (Algebraic DEScription) language, with a translator converting the results to ANSI FORTRAN. One of its most notable distinctions was in being the only major system to completely and carefully document its algorithms.

A fourth general purpose system which made a significant mark in the late 1970's was muMATH. Developed by David Stoutemyer and Albert Rich at the University of Hawaii, it was written in a small subset of LISP and came with its own programming language, muSIMP. It was the first comprehensive computer algebra system which could actually run on the IBM family of PC computers. By being available on such small and widely accessible personal computers, muMATH opened up the possibility of widespread use of computer algebra systems for both research and teaching.

In addition to the systems mentioned above, a number of special purpose systems also generated some interest during the 1970's. Examples of these include: SHEEP, a system for tensor component manipulation designed by Inge Frick and others at the University of Stockholm; TRIGMAN, specially designed for computation of Poisson series and written in FORTRAN by W. H. Jeffreys at University of Texas (Austin); and SCHOONSCHIP by M. Veltman of the Netherlands for computations in high-energy physics. Although the systems already mentioned have all been developed in North America and Europe, there were also a number of symbolic manipulation programs written in the U.S.S.R. One of these is ANALITIK, a system implemented in hardware by V. M. Glushkov and others at the Institute of Cybernetics, Kiev.

1981-1991
---------

Due to the significant computer resource requirements of the major computer algebra systems, their widespread use remained (with the exception of muMATH) limited to researchers having access to considerable computing resources. With the introduction of microprocessor-based workstations, the possibility of relatively powerful desk-top computers became a reality. The introduction of a large number of different computing environments, coupled with the often nomadic life of researchers (at least in terms of workplace locations), caused a renewed emphasis on portability for the computer algebra systems of the 1980's. More efficiency (particularly memory space efficiency) was needed in order to run on the workstations that were becoming available at this time, or equivalently, to service significant numbers of users on the time-sharing environments of the day. This resulted in a movement towards the development of computer algebra systems based on newer "systems implementation" languages such as C, which allowed developers more flexibility to control the use of computer resources.

The decade also marked a growth in the commercialization of computer algebra systems. This had both positive and negative effects on the field in general. On the negative side, users not only had to pay for these systems but also they were subjected to unrealistic claims as to what constituted the state of the art of these systems. However, on the positive side, commercialization brought about a marked increase in the usability of computer algebra systems, from major advances in user interfaces to improvements to their range of functionality in such areas as graphics and document preparation.

The beginning of the decade marked the origin of MAPLE.
Initiated by Gaston Gonnet and Keith Geddes at the University of Waterloo, its primary motivation was to provide user accessibility to computer algebra. MAPLE was designed with a modular structure: a small compiled kernel of modest power, implemented completely in the systems implementation language C (originally B, another language in the "BCPL family"), and a large mathematical library of routines written in the user-level MAPLE language to be interpreted by the kernel. Besides the command interpreter, the kernel also contained facilities such as integer and rational arithmetic, simple polynomial manipulation, and an efficient memory management system. The small size of the kernel allowed it to be implemented on a number of smaller platforms and allowed multiple users to access it on time-sharing systems. Its large mathematical library, on the other hand, allowed it to be powerful enough to meet the mathematical requirements of researchers.

Another system written in C was SMP (Symbolic Manipulation Program) by Stephen Wolfram at Caltech. It was portable over a wide range of machines and differed from existing systems by using a language interface that was rule-based. It took the point of view that the rule-based approach was the most natural language for humans to interface with a computer algebra program. This allowed it to present the user with a consistent, pattern-directed language for program development.

The newest of the computer algebra systems during this decade were MATHEMATICA and DERIVE. MATHEMATICA is a second system written by Stephen Wolfram (and others). It is best known as the first system to popularize an integrated environment supporting symbolics, numerics, and graphics. Indeed, when MATHEMATICA first appeared in 1988, its graphical capabilities (2-D and 3-D plotting, including animation) far surpassed any of the graphics available on existing systems. MATHEMATICA was also one of the first systems to successfully illustrate the advantages of combining a computer algebra system with the easy-to-use editing features on machines designed to use graphical user interfaces (i.e. window environments). Based on C, MATHEMATICA also comes with its own programming language which closely follows the rule-based approach of its predecessor, SMP.

DERIVE, written by David Stoutemyer and Albert Rich, is the follow-up to the successful muMATH system for personal computers. While lacking the wide range of symbolic capabilities of some other systems, DERIVE has an impressive range of applications considering the limitations of the 16-bit PC machines for which it was designed. It has a friendly user interface, with such added features as two-dimensional input editing of mathematical expressions and 3-D plotting facilities. It was designed to be used as an interactive system and not as a programming environment.

Along with the development of newer systems, there were also a number of changes to existing computer algebra systems. REDUCE 3 appeared in 1983, this time with a number of new packages added by outside developers. MACSYMA bifurcated into two versions, DOE-MACSYMA and one distributed by SYMBOLICS, a private company best known for its LISP machines. Both versions continued to develop, albeit in different directions, during this decade. AXIOM (known originally as SCRATCHPAD II) was developed during this decade by Richard Jenks, Barry Trager, Stephen Watt and others at the IBM Thomas J. Watson Research Center.
A successor to the first SCRATCHPAD language, it is the only "strongly typed" computer algebra system. Whereas other computer algebra systems develop algorithms for a specific collection of algebraic domains (such as, say, the field of rational numbers or the domain of polynomials over the integers), AXIOM allows users to write algorithms over general fields or domains.

As was the case in the previous decade, the eighties also found a number of specialized systems becoming available for general use. Probably the largest and most notable of these is the system CAYLEY, developed by John Cannon and others at the University of Sydney, Australia. CAYLEY can be thought of as a "MACSYMA for group theorists." It runs in large computing environments and provides a wide range of powerful commands for problems in computational group theory. An important feature of CAYLEY is a design geared to answering questions not only about individual elements of an algebraic structure, but more importantly, questions about the structure as a whole. Thus, while one could use a system such as MACSYMA or MAPLE to decide if an element in a given domain (such as a polynomial domain) has a given property (such as irreducibility), CAYLEY can be used to determine if a group structure is finite or infinite, or to list all the elements in the center of the structure (i.e. all elements which commute with all the elements of the structure).

Another system developed in this decade and designed to solve problems in computational group theory is GAP (Group Algorithms and Programming) developed by J. Neubueser and others at the University of Aachen, Germany. If CAYLEY can be considered to be the "MACSYMA of group theory," then GAP can be viewed as the "MAPLE of group theory." GAP follows the general design of MAPLE in implementing a small compiled kernel (in C) and a large group theory mathematical library written in its own programming language.

Examples of some other special purpose systems which appeared during this decade include: FORM, by J. Vermaseren, for high energy physics calculations; LiE, by A. M. Cohen, for Lie Algebra calculations; MACAULAY, by Michael Stillman, a system specially built for computations in Algebraic Geometry and Commutative Algebra; and PARI, by H. Cohen in France, a system oriented mainly for number theory calculations. As with most of the new systems of the eighties, these last two are also written in C for portability and efficiency.

Research Information about Computer Algebra
-------------------------------------------

Research in computer algebra is a relatively young discipline, and the research literature is scattered throughout various journals devoted to mathematical computation. However, its state has advanced to the point where there are two research journals primarily devoted to this subject area: the "Journal of Symbolic Computation" published by Academic Press and "Applicable Algebra in Engineering, Communication and Computing" published by Springer-Verlag.

Other than these two journals, the primary source of recent research advances and trends is a number of conference proceedings. Until recently, there was a sequence of North American conferences and a sequence of European conferences. The North American conferences, primarily organized by ACM SIGSAM (the ACM Special Interest Group on Symbolic and Algebraic Manipulation), include SYMSAM '66 (Washington, D.C.), SYMSAM '71 (Los Angeles), SYMSAC '76 (Yorktown Heights), SYMSAC '81 (Snowbird), and SYMSAC '86 (Waterloo).
The European conferences, organized by SAME (Symbolic and Algebraic Manipulation in Europe) and ACM SIGSAM, include the following, whose proceedings have appeared in the Springer-Verlag series "Lecture Notes in Computer Science": EUROSAM '79 (Marseilles), EUROCAM '82 (Marseilles), EUROCAL '83 (London), EUROSAM '84 (Cambridge), EUROCAL '85 (Linz), and EUROCAL '87 (Leipzig). Starting in 1988, the two streams of conferences have been merged and they are now organized under the name ISSAC (International Symposium on Symbolic and Algebraic Computation), including ISSAC '88 (Rome), ISSAC '89 (Portland, Oregon), ISSAC '90 (Tokyo), ISSAC '91 (Bonn) and ISSAC '92 (Berkeley).

-----------------------------------------------
Professor Keith Geddes
Symbolic Computation Group
Department of Computer Science
University of Waterloo
Waterloo ON N2L 3G1
CANADA

in Proceedings of the Third ACM Symposium on Symbolic and Algebraic Computation, August 10-12, 1976, Yorktown Heights, New York, United States