MATH-MATIC (ID:435/mat008)
Mathematically oriented autocode. Marketing name for the AT-3 compiler by Charles Katz et al. at UNIVAC.
In the initial approach letter from the GAMM to the president of the ACM in 1957, this language was considered closest (along with Perlis' IT) to the ideal language for which they were aiming (more so than FORTRAN). Subsequent correspondence confirms this.
References:

Let me elaborate these points with examples. UNICODE is expected to require about fifteen man-years. Most modern assembly systems must take from six to ten man-years. SCAT expects to absorb twelve people for most of a year. The initial writing of the 704 FORTRAN required about twenty-five man-years. Split among many different machines, IBM's Applied Programming Department has over a hundred and twenty programmers. Sperry Rand probably has more than this, and for utility and automatic coding systems only! Add to these the number of customer programmers also engaged in writing similar systems, and you will see that the total is overwhelming. Perhaps five to six man-years are being expended to write the Model 2 FORTRAN for the 704, trimming bugs and getting better documentation for incorporation into the even larger supervisory systems of various installations. If available, more could undoubtedly be expended to bring the original system up to the limit of what we can now conceive. Maintenance is a very sizable portion of the entire effort going into a system. Certainly, all of us have a few skeletons in the closet when it comes to adapting old systems to new machines. Hardly anything more than the flow charts is reusable in writing 709 FORTRAN; changes in the characteristics of instructions, and tricky coding, have done for the rest. This is true of every effort I am familiar with, not just IBM's.

What am I leading up to? Simply that the day of diverse development of automatic coding systems is either out or, if not, should be. The list of systems collected here illustrates a vast amount of duplication and incomplete conception. A computer manufacturer should produce both the product and the means to use the product, but this should be done with the full co-operation of responsible users. There is a gratifying trend toward such unification in such organizations as SHARE, USE, GUIDE, DUO, etc. The PACT group was a shining example in its day. Many other coding systems, such as FLAIR, PRINT, FORTRAN, and USE, have been done as the result of partial co-operation. FORTRAN for the 705 seems to me to be an ideally balanced project, the burden being carried equally by IBM and its customers.

Finally, let me make a recommendation to all computer installations. There seems to be a reasonably sharp distinction between people who program and use computers as a tool and those who are programmers and live to make things easy for the other people. If you have the latter at your installation, do not waste them on production and do not waste them on a private effort in automatic coding in a day when that type of project is so complex. Offer them in a cooperative venture with your manufacturer (they still remain your employees) and give him the benefit of the practical experience in your problems. You will get your investment back many times over in ease of programming and the guarantee that your problems have been considered.

Extract: IT, FORTRANSIT, SAP, SOAP, SOHIO

The IT language is also showing up in future plans for many different computers. Case Institute, having just completed an intermediate symbolic assembly to accept IT output, is starting to write an IT processor for UNIVAC. This is expected to be working by late summer of 1958. One of the original programmers at Carnegie Tech spent the last summer at Ramo-Wooldridge to write IT for the 1103A. This project is complete except for input-output and may be expected to be operational by December, 1957.
IT is also being done for the IBM 705-1, 2 by Standard Oil of Ohio, with no expected completion date yet known. It is interesting to note that Sohio is also participating in the 705 FORTRAN effort and will undoubtedly serve as the basic source of FORTRAN-to-IT-to-FORTRAN translational information. A graduate student at the University of Michigan is producing SAP output for IT (rather than SOAP) so that IT will run on the 704; this, however, is only for experience; it would be much more profitable to write a pre-processor from IT to FORTRAN (the reverse of FOR TRANSIT) and utilize the power of FORTRAN for free.

in "Proceedings of the Fourth Annual Computer Applications Symposium", Armour Research Foundation, Illinois Institute of Technology, Chicago, Illinois, 1957

in [ACM] CACM 1(07) July 1958

We had full intentions of using the IT language for our input to the Univac, as Mr. Bemer reported here last year, then changed our minds and decided that we needed something with "name" variables similar to Fortran or AT-3 (Math-Matic). We also desire to use a language that will be usable on both the Univac and the 650 without too much changing-around of the pseudo-code. This desire is motivated largely by the presence of the two machines as well as by the very nature of our main activity, namely, educational work. The advantages of a common language (or, more properly, a compatible language) are quite clear but not easily attained on two machines of such different capacities. At the present time we are planning to use a multistage process in translating from the pseudo-code (problem-oriented language) to the machine language. It is likely that we will process the original pseudo-code into a simpler pseudo-code, then translate that code, etc., according to the following list:

POL
POL1
SML
Preprocessor
Unisap
C-10

POL will be an algebraic problem-oriented language which will be quite rich in its abilities (possibly including such things as general summation notation, matrix manipulation, "find maximum," "find minimum," differentiation, integration, etc.). POL1 will be another algebraic problem-oriented language with about the same capabilities as presently found in AT-3 (Math-Matic) or Fortran. SML is Simple Machine Language, a language which has the characteristics of a computer but no reference at all to any particular computer. We have already completed the specifications of SML and how it is to work. Following SML will come a "preprocessor," so named because it was invented for the very purpose of preceding Unisap before its utility in this particular connection was evident. The preprocessor is currently being debugged.
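The staging idea described above, in which each translator consumes the output of the one before it, can be pictured as a simple chain of functions. The Python sketch below is a modern illustration of that chaining only; the stage functions are placeholders invented for the example and are not reconstructions of the Case Institute processors.

```python
# Hedged sketch of a staged translation chain in the spirit of the
# POL -> POL1 -> SML -> Preprocessor -> Unisap -> C-10 scheme described above.
# Every stage function here is a placeholder invented for illustration.

def pol_to_pol1(text):
    """Reduce the rich problem-oriented language to a simpler algebraic form."""
    return text          # placeholder: would expand summations, matrix notation, etc.

def pol1_to_sml(text):
    """Translate the algebraic form into Simple Machine Language (SML)."""
    return text          # placeholder

def sml_to_symbolic(text):
    """The 'preprocessor': turn SML into symbolic assembly acceptable to Unisap."""
    return text          # placeholder

def unisap_to_c10(text):
    """Assemble the symbolic program into C-10, the Univac machine code."""
    return text          # placeholder

PIPELINE = [
    ("POL -> POL1", pol_to_pol1),
    ("POL1 -> SML", pol1_to_sml),
    ("SML -> symbolic", sml_to_symbolic),
    ("Unisap -> C-10", unisap_to_c10),
]

def translate(program):
    """Feed the program through every stage in order, as the text describes."""
    for name, stage in PIPELINE:
        program = stage(program)
        print(f"after {name}: {program}")
    return program

if __name__ == "__main__":
    translate("Y = SUM OF A(I) FOR I = 1 TO N")
```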
The coding for SML to preprocessor is now being written. Unisap has already been mentioned as an excellent example of a symbolic assembly program and has been widely distributed. C-10 neither needs any explanation (it is the Univac machine language) nor particularly deserves one.

Extract: SOAP, CASE SOAP

I shall now descend from my blue-sky position and describe what we have already finished doing in the automatic programming business. We, as have several others, started our first efforts at automatic programming after taking a long close look at Dr. Perlis' IT compiler for the IBM 650. As usual, when one examines something created elsewhere, we thought that we could make some improvements in the existing scheme without giving up any of the advantages that were already present. Our first effort was then to write a program named "Compiler II-SOAP II." This program worked very well, turned out nice "tight" coding, and included IT as a subset language. The bulk of the flow charting, planning, and coding for Compiler II-SOAP II was done by two of my undergraduate students, Mr. Lynch, a sophomore, and Mr. Knuth, a freshman.

One of the more unpleasant experiences encountered in the work on the compiler was the discovery that SOAP II was unable to assemble the entire compiler owing to the symbol table becoming filled up at an early stage of the assembly process. The solution to the problem was obvious but not very satisfactory. As a matter of fact we did modify SOAP II to dump the symbol table and then reload it again in modified form, but we abandoned this philosophy as not being a worthwhile solution to the problem. Therefore, Mr. Knuth suggested that he write a new symbolic assembly program with some new features incorporated in it. Accordingly, SOAP III (later renamed CASE-SOAP III due to some rather peculiar complaints from a large corporation) was written.

CASE-SOAP III solved the symbol-table difficulty by introducing a fairly new idea--the program point. Program points are addresses which the programmer needs to introduce in order to cause the machine to function properly but which have no mnemonic value to the functioning of the program. For example, one may frequently encounter the use of addresses such as NEXT and NXT, etc., which are included solely to link the program together but which have no significance at all in the logical structure of the problem. The program point was introduced just to solve the problem of naming these "random" addresses. The program point uses no room at all in the symbol table and is continuously redefinable by the simple expedient of using the same point over again. As an example consider the following segment of coding:

      RAU  ROO02
      BMI  1F
      STL  PO005
      RSL  ONE6   1F
 1    ALO  PCON

In both cases of the use of "1F" in the instruction address position, the meaning is to use the address of program point "1" forward. Now look at the next example (which, by the way, does nothing):

 2    RAU  ROO01  1F
 1    SUP  ONES
      NZE  1F     2B
 1    RAL  6      1B

Note that the use of "1F" still refers to the forward program point, while "1B" refers to the correspondingly numbered backward program point. The use of "6" in the data address of the last instruction refers to the address of that particular line of coding (i.e., the line "1 RAL 6 1B"). Using CASE-SOAP III to write our next compiler revealed that there was an economy of approximately 50 per cent in the number of symbols used in CASE-SOAP III over the equivalent program written in SOAP II.
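The mechanism behind program points is easy to sketch: a reference such as "1F" means the nearest following definition of point 1, and "1B" the nearest preceding one, so these labels never need a symbol-table entry. The Python fragment below is a modern, hypothetical illustration of that lookup rule only; it is not CASE-SOAP III code, the four-field instruction format is a simplification assumed for the example, and the resolved values are just line indices standing in for eventual drum addresses.

```python
# Hedged sketch (not CASE-SOAP III): resolving numbered program points.
# "nF" means the nearest definition of point n on a LATER line, "nB" the nearest
# definition on an EARLIER line, so these labels never enter the symbol table.
# Simplified instruction format: (location, operation, data_address, instruction_address).

import re

def resolve_program_points(lines):
    """Replace nF / nB address fields with the index of the matching program point line."""
    # Pass 1: note every line whose location field (re)defines a program point digit.
    definitions = [(i, loc) for i, (loc, *_rest) in enumerate(lines)
                   if re.fullmatch(r"\d", loc)]

    def lookup(ref, here):
        digit, direction = ref[0], ref[1]
        if direction == "F":                       # nearest matching definition below
            return next(i for i, d in definitions if i > here and d == digit)
        return max(i for i, d in definitions if i < here and d == digit)  # "B": above

    resolved = []
    for i, (loc, op, data_addr, inst_addr) in enumerate(lines):
        fix = lambda a: lookup(a, i) if re.fullmatch(r"\d[FB]", a) else a
        resolved.append((loc, op, fix(data_addr), fix(inst_addr)))
    return resolved

# The second example from the text, written in the simplified four-field form.
program = [
    ("2", "RAU", "ROO01", "1F"),
    ("1", "SUP", "ONES",  ""),
    ("",  "NZE", "1F",    "2B"),
    ("1", "RAL", "6",     "1B"),
]
for line in resolve_program_points(program):
    print(line)
```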
Since we are looking forward to acquiring some day some of the optional attachments for our 650, Mr. Knuth incorporated additional pseudo-operations into CASE-SOAP III which allow one to effectively write programs for the augmented machine. The new compiler referred to in the previous paragraph has been named RUNCIBLE. I shall now stick out my neck and state quite flatly that I do not believe that any tighter program will ever be devised for the 650. RUNCIBLE is again an algebraic problem-oriented compiler for the 650. The input language includes both IT and Compiler II-SOAP II as proper subset languages. The compiler is in a single deck of cards but exists logically in twenty-four different versions. The various versions are selected at read-in time by the storage-entry switch settings. The various possibilities are given by examining the following:

(Multipass OR One-pass) AND (650 OR 653) AND (Minimum Clocking OR Full Clocking OR No Clocking) AND (Normal Output OR Error Search) = (Operating Mode)

We now examine in some detail the options appearing in the foregoing equation. The multipass mode of operation is the one usually associated with 650 compilers (i.e., an intermediate-language step is used). In the case at hand, the intermediate language is CASE-SOAP III. The one-pass mode turns out a machine-language program directly in a five-instruction-per-card format. In the second option the term 650 refers to the basic 2,000-word 650 being used as the object machine (i.e., the machine on which the compiled problem is to be run); the term 653 means that the compiled program is to be run on a machine equipped with floating point, index registers, and (at the programmer's option) core storage. It is also possible to cause the output machine language to be tailored to fit a machine equipped with core storage only and neither index registers nor floating point. The various "clocking" options mentioned refer to the several "tracing" modes that can be called upon to help debug the object program. It is possible to ascertain (while actually operating the object program) upon which statement in the pseudolanguage the machine is working. This one feature is proving to be one of the most helpful debugging aids that we have ever encountered.

Extract: IT, Runcible

The input language to RUNCIBLE is an augmented version of Compiler II-SOAP II which includes names of subroutines and some other English-language options in the control words. A sample program which appears on page 20 of the manual follows:

[sample program listing not reproduced]

The program reads in a non-negative value of I1 (and tests to see that it is non-negative), then calculates I1 factorial and punches both the input number and the factorial value. We are now naturally looking forward to the day when we shall have to consider a 650 compiler for the machine with tapes and RAMAC unit. In conclusion let me state that in our opinion the most important thing to worry about, as far as application of machines to scientific and engineering problems is concerned, is the time consumed from the statement of the problem in English to the presentation of the numerical answer to the problem proposer. We see no reason why the time consumed on the computer should be the only time interval which comes under close scrutiny.
It seems much more reasonable to conserve the time of the scientist and engineer rather than that of the machine; after all, one may obtain machines by the simple expedient of ordering them from the manufacturer, but a foolproof, economic method of obtaining competent personnel is as yet an almost unsolved problem.

Extract: MATH-MATIC vs. RUNCIBLE

M. O. LOCKS (Remington Rand Univac): Could you give me your interpretation of the difference between an assembly system and a compiler system? Also, could you explain how something which is as flexible as a compiler can be gotten onto a 2,000-word drum with no further erasable storage?

MR. WAY: Our view of an assembly system is a facility whereby one writes a program in some language - usually similar to machine language - and gets out an essentially one-to-one translation of that language. In our case we write in a symbolic language, using pseudo-codes and symbolic addresses, and get out a machine-language code along with the actual locations - in the case of the Univac I, for instance, C-10 operation codes and addresses - when we assemble the tape which will run the problem. A compiler, on the other hand, is usually a one-to-many translator. You put in, usually, algebraic statements of your problem and get out a machine code to do the problem; the latter, of course, in appearance bears no resemblance whatsoever to the original problem statement. Now your question about how we do this on a 2,000-word drum is of interest. A number of people said it could not be done, but Perlis proved that this was wrong. I notice that you are with Remington Rand Univac, so I see why the question comes up! If anybody ever writes a compiler that will fit inside the 1,000-word memory of the Univac I, I would certainly like to see it. The whole philosophy of compiling is dictated by the machine available to do it. In the case of the Univac I, with ten tape units, one can spin wheels and do nearly anything if he has time. As examples, witness Math-matic and Flow-matic, which do spin wheels and take time, but do compile. With the 650 one does not use this type of philosophy, mainly because he cannot. So, one uses a different method of generating the machine language. I realize I haven't told you in detail how to write a compiler; I didn't intend to. Does that answer your question?

MR. LOCKS: Yes, that does basically answer the question, but I would like to inquire how much your 650 compiler can do as compared to a compiler which does spin wheels, Math-matic, for instance.

MR. WAY: I am sure we can run rings around it with this 650 compiler, the basic reason being that the 650 storage of 2,000 words allows us to get more in the main storage of the machine at once. In our view the Math-matic system at present is rather handicapped by the input-output of the Univac I. The man who writes pseudo-code in Math-matic must know a lot about the Univac I, or he will not be able successfully to segment a problem. At least this has been our experience.

Extract: FORTRANSIT, IT, RUNCIBLE

R. B. WISE (Armour Research Foundation): You mentioned that you were going to bypass the Perlis language compiler for the Univac I and write something more ambitious. Will this allow for IT compiler input?

MR. WAY: We haven't decided as yet, but we would like to allow it if possible.

MR. WISE: I, for one, would want to register a small protest if you do not.
The IT language is about the closest thing we have today to the universal language among computers, and it seems to me that it would be very much worthwhile to have something - even something which some people may classify as mediocre - which will allow communication among computers. This has been written for the 650; through the use of FOR TRANSIT, you have access to some programs for the 704; it has been written for the 1103A; and, in somewhat modified form, it also fits in the Datatron 205. I think there is a big argument for allowing that type of input.

MR. WAY: You opened a loophole when you said "in somewhat modified form." When I said before that our compiler would accept previous programs, I meant without any alteration.

in Proceedings of the 1958 Computer Applications Symposium, Armour Research Foundation, Illinois Institute of Technology, Chicago, Illinois

in E. M. Crabbe, S. Ramo, and D. E. Wooldridge (eds.), "Handbook of Automation, Computation, and Control," John Wiley & Sons, Inc., New York, 1959

Assembly and compiling systems both obey the "pre-translation" principle. Pseudo instructions are interpreted and a running program is produced before the solution is initiated. Usually this makes possible a single set of references to the library rather than many repeated references. In an assembly system the pseudo-code is ordinarily modified computer code. Each pseudo instruction refers to one machine instruction or to a relatively short subroutine. Under the control of the master routine, the assembly system sets up all controls for monitoring the flow of input and output data and instructions. A compiler system operates in the same way as an assembly system, but does much more. In most compilers each pseudo instruction refers to a subroutine consisting of from a few to several hundred machine instructions. Thus it is frequently possible to perform all coding in pseudo-code only, without the use of any machine instructions. From the viewpoint of the user, compilers are the more desirable type of automatic programming because of the comparative ease of coding with them. However, compilers are not available with all existing equipments. In order to develop a compiler, it is usually necessary to have a computer with a large supplementary storage such as a magnetic tape system or a large magnetic drum. This storage facilitates compilation by making possible as large a running program as the problem requires. Examples of assembly systems are Symbolic Optimum Assembly Programming (S.O.A.P.) for the IBM 650 and REgional COding (RECO) for the UNIVAC SCIENTIFIC 1103 Computer. The X-1 Assembly System for the UNIVAC I and II Computers is not only an assembly system, but is also used as an internal part of at least two compiling systems.

Extract: MATH-MATIC, FORTRAN and UNICODE

For scientific and mathematical calculations, three compilers which translate formulas from standard symbologies of algebra to computer code are available for use with three different computers. These are the MATH-MATIC (AT-3) System for the UNIVAC I and II Computers, FORTRAN (for FORmula TRANslation) as used for the IBM 704 and 709, and the UNICODE Automatic Coding System for the UNIVAC SCIENTIFIC 1103A Computer.

Extract: FLOW-MATIC and REPORT GENERATOR

Two advanced compilers have also been developed for use with business data processing.
These are the FLOW-MATIC (B-ZERO) Compiler for the UNIVAC I and II Computers and REPORT GENERATOR for the new IBM 709. In these compilers, English words and sentences are used as pseudocode.

in E. M. Crabbe, S. Ramo, and D. E. Wooldridge (eds.), "Handbook of Automation, Computation, and Control," John Wiley & Sons, Inc., New York, 1959

[...] employ new methods in many areas of research. Performance of 1 million multiplications on a desk calculator is estimated to require about five years and to cost $25,000. On an early scientific computer, a million multiplications required eight minutes and cost (exclusive of programing and input preparation) about $10. With the recent LARC computer, 1 million multiplications require eight seconds and cost about 50 cents (Householder, 1956). Obviously it is imperative that researchers examine their methods in light of the abilities of the computer. It should be noted that much of the information published on computers and their use has not appeared in educational or psychological literature but rather in publications specifically concerned with computers, mathematics, engineering, and business. The following selective survey is intended to guide the beginner into this broad and sometimes confusing area. It is not an exhaustive survey. It is presumed that the reader has access to the excellent Wrigley (1957) article; so the major purpose of this review is to note additions since 1957. The following topics are discussed: equipment availability, knowledge needed to use computers, general references, programing the computer, numerical analysis, statistical techniques, operations research, and mechanization of thought processes.

Extract: Compiler Systems

A compiler is a translating program written for a particular computer which accepts a form of mathematical or logical statement as input and produces as output a machine-language program to obtain the results. Since the translation must be made only once, the time required to repeatedly run a program is less for a compiler than for an interpretive system. And since the full power of the computer can be devoted to the translating process, the compiler can use a language that closely resembles mathematics or English, whereas the interpretive languages must resemble computer instructions. The first compiling program required about 20 man-years to create, but use of compilers is so widely accepted today that major computer manufacturers feel obligated to supply such a system with their new computers on installation. Compilers, like the interpretive systems, reflect the needs of various types of users. For example, the IBM computers use "FORTRAN" for scientific programing and "9 PAC" and "ComTran" for commercial data processing; the Sperry Rand computers use "Math-Matic" for scientific programing and "Flow-Matic" for commercial data processing; Burroughs provides "FORTOCOM" for scientific programming and "BLESSED 220" for commercial data processing. There is some interest in the use of "COBOL" as a translation system common to all computers.

in E. M. Crabbe, S. Ramo, and D. E. Wooldridge (eds.), "Handbook of Automation, Computation, and Control," John Wiley & Sons, Inc., New York, 1959
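The one-to-one versus one-to-many distinction drawn in Mr. Way's answer and in the Handbook extracts above can be made concrete with a toy sketch. Everything in it (the operation codes, the minimal machine model, the single statement pattern handled) is invented for illustration and corresponds to none of the systems named in the text.

```python
# Toy illustration of the assembler/compiler distinction discussed above.
# All names (ops, opcode values, statement syntax) are invented for this example.

def assemble(symbolic_lines):
    """Assembler: essentially one machine instruction per symbolic line."""
    table = {"LOAD": 0x10, "ADD": 0x20, "MUL": 0x30, "STORE": 0x40}
    out = []
    for line in symbolic_lines:
        op, operand = line.split()
        out.append((table[op], operand))       # one-to-one translation
    return out

def compile_statement(stmt):
    """Compiler: one algebraic statement expands into several instructions."""
    # Handles only the fixed pattern  target = a * x + b  for illustration.
    target, expr = (s.strip() for s in stmt.split("="))
    a, rest = (s.strip() for s in expr.split("*"))
    x, b = (s.strip() for s in rest.split("+"))
    return [                                   # one-to-many translation
        (0x10, a),        # LOAD  a
        (0x30, x),        # MUL   x
        (0x20, b),        # ADD   b
        (0x40, target),   # STORE target
    ]

if __name__ == "__main__":
    print(assemble(["LOAD A", "MUL X", "ADD B", "STORE Y"]))
    print(compile_statement("Y = A * X + B"))
```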
in Goodman, Richard (ed), "Annual Review in Automatic Programming" (1), Pergamon Press, Oxford, 1960

Most of the papers are written in a style and manner which seem to have become universally accepted for papers on computer programming, at least in the English-speaking world and probably in others. This style insists on a liberal dosage of impressively detailed flow charts which, considering the well-known and understandable reluctance of programmers to read their own programs much less those of others, one suspects most readers hastily skip over, willingly granting their authenticity. The flow charts are invariably accompanied by long lists of special instructions described in the private patois of the author, who seems blissfully unaware or unconcerned that his specially constructed vocabulary of acronyms may present rough going to the reader from the outlying provinces. Finally, the style demands long and wearisome descriptions of basic concepts (e.g., subroutine, symbolic instruction, etc.) long since familiar to the average reader, and some indication of difficulties as yet to be surmounted (e.g., automatic storage allocation, easier debugging, et al.). Nevertheless, the volume does give some idea of the status of automatic programming systems in Great Britain in early 1959. It also contains a concise description of the 709 SHARE operating system, and another brief account of FLOW-MATIC and MATH-MATIC. There are two interesting appendices worthy of mention. Appendix One consists of reprints of two papers by the late A. M. Turing, "On Computable Numbers with an Application to the Entscheidungsproblem", in which the "Turing machine" was conceived, and a brief corrective note on the same subject. Appendix Two contains the "Preliminary Report of the ACM-GAMM Committee on an International Algebraic Language", since published elsewhere. The reviewer cannot suppress the question of whether this sort of material (Appendices excepted), so soon obsolescent or obsolete and so difficult to present adequately in short papers, deserves the effort and expense required to reproduce it between the bound hard covers of a handsome book.

in ACM Computing Reviews 2(03) May-June 1961

in [ACM] CACM 6(03) (Mar 1963)

The first large-scale electronic computer available commercially was the Univac I (1951). The first Automatic Programming group associated with a commercial computer effort was the group set up by Dr. Grace Hopper in what was then the Eckert-Mauchly Computer Corp., and which later became the Univac Division of Sperry Rand. The Univac had been designed so as to be relatively easy to program in its own code. It was a decimal, alphanumeric machine, with mnemonic instructions that were easy to remember and use. The 12-character word made scaling of many fixed-point calculations fairly easy. It was not always easy to see the advantage of assembly systems and compilers that were often slow and clumsy on a machine with only 12,000 characters of high-speed storage (200 microseconds average access time per 12-character word). In spite of occasional setbacks, Dr. Hopper persevered in her belief that programming should and would be done in problem-oriented languages. Her group embarked on the development of a whole series of languages, of which the most used was probably A2, a compiler that provided a three-address floating-point system by compiling calls on floating-point subroutines stored in main memory.
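A rough sketch of the general idea behind such a compiler: each three-address floating-point pseudo-instruction becomes, in the generated program, a short sequence that supplies the operand addresses and then calls a floating-point subroutine kept resident in memory. The pseudo-operation names, entry addresses, and generated sequence below are assumptions made for this illustration; they are not the actual A2 code.

```python
# Illustration only: expanding three-address floating-point pseudo-instructions
# into calls on resident subroutines, in the general spirit described above.
# All names (FADD/FMUL, "CALL", the address layout) are invented for this sketch.

SUBROUTINE_ENTRY = {"FADD": 900, "FSUB": 910, "FMUL": 920, "FDIV": 930}

def expand(pseudo_program):
    """Turn (op, a, b, c) pseudo-instructions into a subroutine-call sequence."""
    generated = []
    for op, a, b, c in pseudo_program:
        generated += [
            ("SET-ARG1", a),                    # address of first operand
            ("SET-ARG2", b),                    # address of second operand
            ("SET-RESULT", c),                  # address for the result
            ("CALL", SUBROUTINE_ENTRY[op]),     # jump to the resident routine
        ]
    return generated

if __name__ == "__main__":
    # c = a + b, then e = c * d, written as three-address pseudo-code
    for line in expand([("FADD", 100, 101, 102), ("FMUL", 102, 103, 104)]):
        print(line)
```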
The Algebraic Translator AT3 (Math-Matic) contributed a number of ideas to Algol and other compiler efforts, but its own usefulness was very much limited by the fact that Univac had become obsolete as a scientific computer before AT3 was finished. The B0 (Flow-Matic) compiler was one of the major influences on the COBOL language development, which will be discussed at greater length later. The first sort generators were produced by the Univac programming group in 1951. They also produced what was probably the first large-scale symbol manipulation program, a program that performed differentiation of formulas submitted to it in symbolic form.

in [AFIPS JCC 25] Proceedings of the 1964 Spring Joint Computer Conference, SJCC 1964

The exact number of all the programming languages still in use, and those which are no longer used, is unknown. Zemanek calls the abundance of programming languages and their many dialects a "language Babel". When a new programming language is developed, only its name is known at first and it takes a while before publications about it appear. For some languages, the only relevant literature stays inside the individual companies; some are reported on in papers and magazines; and only a few, such as ALGOL, BASIC, COBOL, FORTRAN, and PL/1, become known to a wider public through various text- and handbooks. The situation surrounding the application of these languages in many computer centers is a similar one. There are differing opinions on the concept "programming languages". What is called a programming language by some may be termed a program, a processor, or a generator by others. Since there are no sharp borderlines in the field of programming languages, works were considered here which deal with machine languages, assemblers, autocoders, syntax and compilers, processors and generators, as well as with general higher programming languages. The bibliography contains some 2,700 titles of books, magazines and essays for around 300 programming languages. However, as shown by the "Overview of Existing Programming Languages", there are more than 300 such languages. The "Overview" lists a total of 676 programming languages, but this is certainly incomplete. One author has already announced the "next 700 programming languages"; it is to be hoped the many users may be spared such a great variety for reasons of compatibility. The graphic representations (illustrations 1 & 2) show the development and proportion of the most widely-used programming languages, as measured by the number of publications listed here and by the number of computer manufacturers and software firms who have implemented the language in question. The illustrations show FORTRAN to be in the lead at the present time. PL/1 is advancing rapidly, although PL/1 compilers are not yet seen very often outside of IBM. Some experts believe PL/1 will replace even the widely-used languages such as FORTRAN, COBOL, and ALGOL. If this does occur, it will surely take some time, as shown by the chronological diagram (illustration 2). It would be desirable from the user's point of view to reduce this language confusion down to the most advantageous languages. Those languages still maintained should incorporate the special facets and advantages of the otherwise superfluous languages.
Obviously such demands are not in the interests of computer production firms, especially when one considers that a FORTRAN program can be executed on nearly all third-generation computers. The titles in this bibliography are organized alphabetically according to programming language, and within a language chronologically and again alphabetically within a given year. Preceding the first programming language in the alphabet, literature is listed on several languages, as are general papers on programming languages and on the theory of formal languages (AAA). As far as possible, most of the titles are based on autopsy. However, the bibliographical description of some titles will not satisfy bibliography-documentation demands, since they are based on inaccurate information in various sources. Translation titles whose original titles could not be found through bibliographical research were not included. In view of the fact that many libraries do not have the quoted papers, all magazine essays should have been listed with the volume, the year, issue number and the complete number of pages (e.g. pp. 721-783), so that interlibrary loans could take place with fast reader service. Unfortunately, these data were not always found. It is hoped that this bibliography will help the electronic data processing expert, and those who wish to select the appropriate programming language from the many available, to find a way through the language Babel. We wish to offer special thanks to Mr. Klaus G. Saur and the staff of Verlag Dokumentation for their publishing work. Graz, Austria, May 1973

in [AFIPS JCC 25] Proceedings of the 1964 Spring Joint Computer Conference, SJCC 1964