ARGUS (ID:3841/arg003)

Automatic Routine Generating and Updating System 



Assembly System for Honeywell 800
Together with FACT and the Algebraic Compiler, it made up the full software suite for the Honeywell 800.

Related languages
ARGUS => Algebraic Compiler   Sibling
ARGUS => FACT   Sibling
ARGUS => Mercury Assembler   Replacement for
ARGUS => Metropolitan Honeywell Compromise Compiler   Target language for

References:
  • "Honeywell EDP Presents A FEW QUICK FACTS ON SOFTWARE" Datamation, October 1961. Extract: Honeywell ARGUS
    Honeywell ARGUS
    An assembly program for the Honeywell 800, called ARGUS, includes an easy-to-use assembly language, a library of routines and generators, and all the operational programs that simplify program writing, reduce errors, and keep the need for human intervention to a minimum, thus utilizing the computer in the most efficient manner. There are versions of ARGUS for both punched-card and magnetic-tape systems.
  • BCS Bulletin - Literature and References to Simplified Programming Schemes for Computers, Available or Projected - November 1961
  • Sammet, Jean E "1960 Tower of Babel" diagram on the front of CACM January 1961

          in [ACM] CACM 4(01) (Jan 1961)
  • Weik, Martin H. "A Third Survey of Domestic Electronic Digital Computing Systems" Rpt 1115, BRL, Maryland, 1961. External link: Online copy at Computer History Museum. Extract: LARC details
    Univac LARC is designed for large-scale business data processing as well as scientific computing. This includes any problems requiring large amounts of input/output and extremely fast computing, such as data retrieval, linear programming, language translation, atomic codes, equipment design, large-scale customer accounting and billing, etc.

        University of California
        Lawrence Radiation Laboratory
        Located at Livermore, California, the system is used for the
        solution of differential equations.
    [?]
    Outstanding features are ultra high computing speeds and the input-output control completely independent of computing. Due to the Univac LARC's unusual design features, it is possible to adapt any source of input/output to the Univac LARC. It combines the advantages of Solid State components, modular construction, overlapping operations, automatic error correction and a very fast and a very large memory system.
    [?]
    Outstanding features include a two-computer system (arithmetic, input-output processor); decimal fixed or floating point with provisions for double-precision arithmetic; single-bit error detection of information in transmission and arithmetic operation; and a balanced ratio of high-speed auxiliary storage to core storage.
    Unique system advantages include the two-computer system, which allows versatility and flexibility in handling input-output equipment, and program interrupt on programmer contingency and machine error, which allows greater ease in programming.
  • Bemer, R "ISO TC97/SC5/WGA(1) Survey of Programming Languages and Processors" December 1962
          in [ACM] CACM 6(03) (Mar 1963)
  • Harwell, J. C. "How is FACT getting on?" Computer Bulletin 6(4) March 1963 pp137-138. Extract: How is FACT...
    The last little peak in the publicity noise level that denoted an item about FACT seems to have been some time ago. There have been several confident pronouncements from various implementers about little brother COBOL. However, where a working COBOL compiler is actually being used a similar quietness appears to prevail about how it is getting on.
    For some time after a compiler has reached a state in which it is fit to be released, errors continue to be found in it. The rate at which corrections are sent out to users drops steadily until eventually months pass without an error being found. The moment at which a compiler is, in that sense, error-free is not a clearly marked stage. It is in any case difficult for a manufacturer to announce six months after the release fanfare that his compiler is now actually fit to be used.
    There is another reason for the prevailing quietness regarding experience in the use of commercial compilers. That is that the original excitement was too great. The reality continues to promise great things for the future. But in the meantime it is just a little drab. Why should this be so if the compilers work correctly to the original specification upon which the hopes were based?
    Perhaps, because it was going to be so easy to write programs, it was naturally assumed that the programs written would be perfectly written. By and large they are correctly written but the relentlessness with which a computer explores every logical crevice in a program means that the old need for accurate machine code now becomes a need for precision in the plain language statements. And this precision must relate to the effect of each word, to the interpretation of symbols by the compiler and to all the combinations that occur in the program.
    There is about a plain language for a computer a deceptive air of meaning what it says. Plain language in inter-man use fits the man context. To apply it to the computer context by direct analogy is fine so long as the analogy fits; generally it fits very badly. The best fit is in a statement like GO TO TOWN-HALL where there exists a paragraph in the program labelled TOWN-HALL. We know, with a smile, what it means. But the COBOL statement MOVE CORRESPONDING CUSTOMER-ACCOUNT OF MASTER TO MAIL-LABEL OF NEW-MASTER requires a fair deal of thinking about, and that thinking is neither about a computer nor about any clear human situation. The situation is a contrived one and its concepts are clear and hard only in terms of the compiling process and the object computer configuration. Each word adds its own to the total meaning of the sentence and the sentence has no meaning apart from that achieved in that synthetic way.
    In human talk, a sentence is generally more than the sum of its parts, it is frequently a single symbol of which the component words do no more than enable it to be recognised. There are, virtually, such sentences in FACT. For example the statement SEE INPUT-EDITOR could equally well be replaced by the slightly less pompous GET-ON-WITH-READING-THE-CARDS-NOW, at the cost to the programmer of remembering it as a special case. It is obviously preferable that the SEE INPUT-EDITOR form be used because it is connected by the SEE, DO, PERFORM rules to the rest of the language. It is immaterial in this case whether the programmer writes SEE INPUT-EDITOR because he remembers that as the phrase to use or because he wishes to link in a generated INPUT-EDITOR procedure to the flow of his program.
    But it is not necessarily immaterial which approach he uses with other statements in a programming language. If he is to write correct programs he must know the effect of each word that he sets down within the context of the surrounding words and descriptions. The programmer may rely on the conceptual coherence of the portion of the language with which he is concerned in order to know the effect of the statements which he writes, or he may refer to given rules on the correct use of each word and on its correct relationship with others.
    This article does not attempt an analysis of either FACT or COBOL to establish the underlying ideas. FACT and COBOL are clearly different in their conceptual structure in spite of the seeming similarity of words, syntax and field of operation. What it does attempt is to account for the rather odd difficulties being encountered in this stage of development and use of commercial data-processing languages.
    A powerful programming language such as FACT offers a means of manipulating business data flexibly and accurately. A beginner can write correct but straightforward programs to do an enormous amount of computing. Gradually he can incorporate in his programs all the facilities which FACT contains. But this is another way of saying that in order to use FACT facilities properly and fully, the programmer requires to learn it in detail.
    FACT is well-founded in its file structure. The FACT file hierarchy is conceptually simple and at the same time appropriate to the ordinary complexity of business data processing. COBOL files are founded conceptually on blocks of information assembled in a core memory buffer (agreed that business data processing can be moulded to this without too much difficulty). What matters is that the programmer makes some of his errors in fitting the language concepts to the job and others in fitting the statements permitted in the language to the concepts.
    Large fields of potential error are removed by the provision in FACT of generated routines for the input of whole files of punched cards and the output of whole series of printed documents in accordance with static descriptions instead of by program statements for sequential execution by the computer. The programmer's difficulty of using time-sequential steps to arrange and determine logical and physical relationships is removed. By such means FACT brings the programmer nearer to reality than even machine-code allows. But from within the given universe the particular has to be specified and accuracy is essential. While the FACT approach is different from and much easier than that which COBOL adopts it also requires a little more care than might at first sight seem necessary. The data and relationships must be correctly described.
    Even the FACT statement "SORT file-name TO file-name, CONTROL ON Key-1, Key-2, ... Key-n" which seems very simple can be wrongly used. While it will sort multi-reel files if correct, it will fail on half-a-dozen records if appropriate keys are not stated for the groups on the file. There is a certain minimum amount of information to be given before even a hand-sort can be performed and if less is given the sort cannot succeed.
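    The pitfall described above can be sketched with a modern analogy (Python, not FACT syntax; the record fields are invented for illustration): controlling on the major key alone leaves each group's internal order undefined, the analogue of omitting keys for the groups on a file.

```python
# Records with a two-level key hierarchy: account number, then date.
records = [
    {"account": 2, "date": 3}, {"account": 1, "date": 2},
    {"account": 2, "date": 1}, {"account": 1, "date": 1},
]

# Controlling on the major key only: the groups form, but the order
# within each group is whatever the input happened to contain.
partial = sorted(records, key=lambda r: r["account"])

# Controlling on every level of the hierarchy gives a complete order.
full = sorted(records, key=lambda r: (r["account"], r["date"]))
print(full)
```

    Even a hand-sort needs the full key hierarchy stated; the compiler cannot infer the missing levels.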
    No programmer ever knowingly writes a wrong statement: either another programmer or the computer makes plain the error. In the present stage when experience is being gained in the use of FACT and COBOL it is the computer which is the main agency in the inspection of programs. Much desk checking is certainly possible, but there is a limit to how much desk checking can reasonably be done. The computer will as easily reveal ten clerical errors in a program as the one error still remaining after a thorough check. In the case where one clerical error obscures another this applies with less force, but generally the computer will be used for the checking of clerical errors.
    Honeywell use a short "diagnostic" version of the FACT compiler to produce a listing of each program input to it, together with explanatory comments on the errors found in it. On the H-800 computer it takes from 2 to 25 minutes running time to obtain a diagnostic listing of a FACT source program. The listing is sent to the programmer who makes any needed correction to his program. Theoretically this process is repeated once; in practice it is, at the moment, repeated several times before the listing is free from diagnostic comments.
    Compilation of FACT programs on the H-800 requires from 15 to 50 minutes of machine running time. The compilation process produces another listing of the source program and from a few hundred up to many thousand words of program in ARGUS Assembly code. This is batched with ARGUS code programs from other sources and assembled on to a symbolic program magnetic tape where it is available for subsequent program testing and production scheduling and for alteration by hand-made changes if these should be wanted.
    Program testing initially reveals, almost always, some logical error in the source program. With FACT the error is generally easily traced and a small change in the program is made. One then either recompiles or makes a correction to the ARGUS coding.
    The choice made between recompilation and correction of the ARGUS coding, partly reflects one's view of the programming language. One may see the language as a means of programming the computer or one may see it as a means of producing large amounts of lower-level program code for the computer. To re-compile requires extra machine-time but keeps the documentation straight and is, if machine-time is available, quick and simple. To patch the ARGUS coding requires a knowledgeable ARGUS programmer and means that the program listing is no longer in line with the object program on tape, but saves machine-time.
    In short, it is unexpected but true that plain language programmers make errors and these may necessitate several diagnostic runs and several compilations and assemblies. While the labour cost of writing such programs is a fraction of that for writing an equivalent amount of assembly code, the cost of machine-time is apparently very high. It is probably, however, no higher than would be required by the ARGUS programmers for assembling and testing the equivalent job within the lower level assembly language. The proportion of machine time to programming time changes enormously with the use of a compiler.
    As experience is gained in writing plain language programs the frequency with which a complicated program works first time of trying will rise. The skill of the programmer and the complexity of the problem together determine the cost of programming and a small change in either might make a very large difference to the cost. Unskilled programmers will be able, with a little assistance, to write straightforward programs correctly.
    The solution to the present problems lies in the skill of the programmer. The rewards of automatic coding are large and they are now available; they are those which have been the motive force behind the language design and implementation effort; however the hope that a complete layman would be able to use the full power of a language after three or four days instruction must be, for the moment, abandoned. When machine-time is severely limited the expert FACT programmer, and even more the COBOL programmer, is going to have to know the language intimately, the operation of the compiler fairly well, the intermediate assembly code and the machine. The next stage, starting now, is going to be one in which success in getting cheap and rapid production results will be normal.
    Already with only two FACT programmers, three large applications have been programmed in under eight months. Early inexperience led to the loss of much time in repeated compilations, but even in those circumstances the use of FACT compares favourably with the use of assembly language. To produce by hand the coding of over 450,000 three-address instructions involved in the applications, even if somewhat more condensed, would require many man years of effort.

  • Weik, Martin H. "A Fourth Survey of Domestic Electronic Digital Computing Systems" Report No. 1227, January 1964, Ballistic Research Laboratories, Aberdeen Proving Ground, Maryland. External link: Online copy at Computer History Museum
  • Neff, B. L. "Do-it-yourself software experience" Extract: Early programming experience
    Early programming experience
    To return to the software story: those of us who did the original programming on Univac I in 1953 and 1954 started out being thoroughly indoctrinated with the virtues of machine coding. In 1953 there wasn't much of anything else anyhow. In 1954 we were very fortunate in getting a "compiler," produced at New York University by Roy Goldfinger. It amounted to what we would nowadays call an assembler, in that it allowed subroutines to be stored on a library tape, to be called in by a programmer at assembly time. We recoded it in our shop to add some features we thought desirable, later recoded it again for Univac II, and it is still one of the basic Metropolitan assembly systems for Univac II programs. It is a sobering reflection on the fantastic progress of computer technology in the last 10 years to remember how avant-garde this assembly system appeared to us in 1954, and what a great help it was in our programming work; then to come back to the present day, when an assembly system is taken for granted, like a self-starter on an automobile.
    A few years after our 1954 start, the first ideas about using problem-oriented languages for coding were being discussed, advocated, sneered at, and praised. We programmers thought these efforts quite impractical, but our higher management saw them as rays of hope on the horizon. Systems and claims for systems that used actual English language words came into being: B-0 Flowmatic, COBOL, FACT. We tried out the B-0 system on our own machines. The early efforts at implementing these languages on the small-memory machines then available produced results that were such that we had no trouble recognizing their impracticability. But these systems had a point which was inescapable: the country simply did not contain enough clever machine-oriented programmers, or people who could be so trained, to cope with the explosion of computer programming that was occurring. So a few of us stopped scoffing, and considered that we didn't really object to the general language approach of these new systems— it was the implementation we didn't like, i.e. the volume of coding generated, and its efficiency in terms of running time of the generated program.
    Extract: First compiler
    First compiler
    In 1959 therefore we decided to make up our own English Language Compiler for the Univac II, designing the language ourselves by considering real existing programs, and requiring it to handle everything we could think of. Before the language was frozen, it was applied to actual production runs, large and complex, to establish its practicability. We were conscious at the time that some new, as yet unknown, machine would replace Univac II within a few years, and tried to keep the language reasonably machine-independent.
    This first effort in compilers in our shop was working by the end of 1960, and the coding produced was not bad. A basic decision was made during the implementation phase, which we have held to in our later compiler work, and which I consider important: it was decided that the efficiency of the coding produced was far more important than the efficiency of the compiling process. That is, we did not care whether compiling time was fast or slow, but the object program produced had to run fast, and had to use memory space economically. The coding produced had to compare favourably with direct machine-language coding by a human being, although we realized it could never be quite that good. At that time, and you can still see it today, claims of competing compilers seemed to emphasize compiling time exclusively: "Our compiler produced this program in only three minutes, while the Brand-X compiler took five minutes." The Metropolitan Univac Compiler took about an hour, sometimes longer, to produce a full 2,000 word program. This is one of the reasons why I stressed earlier in my talk the importance of considering the environment in which software is used before evaluating it. Our data processing activity features many recurring jobs, few one-shot jobs. The extra time we take to compile is more than recaptured once a job goes into periodic application. Obviously a service bureau would take the opposite point of view, and correctly.
    Extract: Compromise compiler for H-800
    Temporary compiler for H-800
    To return to the English Language Compiler situation: we started immediately to work on an H-800 version, that is, one to produce H-800 coding, but wisely decided to program it for the Univac II. This had several immediate advantages. First, we had a fluency with Univac II language that we knew would take some time to develop with H-800 language, and we knew how to debug Univac programs well, with six years' experience. Second, we could test it out as we went, on our own machines in our own shop, without getting into a queue for time on the few H-800 machines then working, all of which were outside our office. Third, about one-third of the job was already done, since the preliminary syntactical analysis in the working Univac compiler required few modifications to serve the same purpose in a Honeywell compiler. Of course, we knew it could not be a permanent solution, but the fact that we intended to phase out the Univac II machines only very gradually made it a very acceptable solution.
    I can't resist pointing out that this decision, to implement an H-800 compiler on a Univac II, is one that only a user could have made, and is a good example of the advantages, sometimes unexpected, of a computer user furnishing some part of his software requirements himself. It was a slick way of getting work onto the new machines quickly. It is entirely possible that our inability to do this type of work would have set our whole time-table back a year or more. One of the many good points we considered about the H-800 was its relatively early availability, compared with some other machines we studied—this advantage might have lost its appeal if we had not been confident that our own staff would be adequately supplied with tools to exploit the advantage of an early hardware installation.
    This hybrid compiler was called the "Compromise Compiler"—a misleading name perhaps, but we did not feel that the name had much importance except to identify it. We have never been very keen on tricky names in the Metropolitan computer division. The wholly H-800 compiler that eventually replaced the Compromise Compiler really has no name at all, except "The Compiler"—The Executive Routine is simply called that, or sometimes "Met-Exec." We marvel at the ingenuity that other people display in coming up with those wonderful acronyms like COBOL, ADMIRAL, FACT, SOAP, and the like. We wonder where they find the time for such research. Many acronyms have such an ultimate flavour about them that they seem to leave little room for improvement. The original UNIVAC, for example, which is now carefully preserved in the Smithsonian Institution in Washington, along with Lindbergh's airplane, had a name that meant "Universal Automatic Computer." Where can one go from there? Nor is this a peculiarly American disease: I had pangs of envy and admiration when I first heard about the great Nebula compiler in Orion; that one deserved a prize. I suppose I can sum up these parenthetical remarks by saying that we don't care what people call it, provided it works.
    A few minor modifications in the compiler language were made in adapting the compiler to the H-800. These had to do with methods of identifying tapes, and some related to the matter of running under the control of an Executive routine. No changes at all affected the processing language itself, and it was fairly easy to spot and change Univac statements that required changing in order to recompile on the Compromise Compiler for the H-800.
    Although the New York machines originally had 16,000-word memories, our smaller machines in San Francisco and Ottawa, installed in 1962, had 12,000 and 8,000-word memories, and in order to be able to run each other's programs, we restricted H-800 program size initially to 6,000 words of memory. This restriction had the desirable effect of permitting parallel operation to some extent, at least in New York. Where the Univac Compiler took about one hour to produce programs restricted to the size of the Univac 2000-word memory, the Compromise Compiler took from two to three hours to produce an H-800 6,000-word program— the correlation between time of compiling and memory size of the compiled program was quite close. Later, in 1963, we doubled memory size on the New York machines to 32,000 words each, and the San Francisco and Ottawa machines were increased to 16,000 words each, and at the same time we raised the permissible size of a program to 12,000 words. This of course made compiling time rather enormous on those few programs that went to very large sizes, but it occurred at a time when enough work had been transferred from Univac to Honeywell to make such time available. The "pure" Honeywell compiler (i.e. compiling runs on the H-800 for the H-800) went into operation in September of 1964,
    and the whole problem of excessive Univac compiling time promptly disappeared. We had scrapped one of our Univac II machines in May of 1964 (we tried to sell it, but no one wanted it even as a gift), in order to provide floor space for an H-1800, and it was a great relief when the new compiler started operating four months later. The risk of not having enough time to compile for new trials being worked on during those four months was not a real one, since we could have bought some time outside—but this was not necessary, as it turned out.
    During the years that we used the Compromise Compiler, about 500 programs were produced by it, and no other programming system had to be used for regular production type work. Programmer confidence in the system is very high, and has continued high with the introduction of the newer all-Honeywell compiler. One simple indication of this acceptance is the fact that when bugs show in test sessions the normal reaction is to examine the statements for errors first, not the coding produced. Project leaders and managerial personnel above the programmer level are completely sold on the system, and it is generally felt that the progress we have made over the last few years would not have been possible without the compiler.
    Extract: Final Honeywell compiler
    Final Honeywell compiler
    Work started on the "pure" Honeywell compiler about 27 months prior to its completion in September of 1964. There was less urgency, except at the very end, and we wanted to gather up all those tidbits of hindsight that the two earlier compilers gave us, and really do a good job. Rule: if you want a job done well, do it the third time. After only one year of its development, the partially completed compiler started to pay large dividends, since its very complete and fast diagnostic section was working by then. This enabled us to find programmers' errors very rapidly on the H-800, then send to Univac only those programs that were clean—this simple step cut Univac compiler time requirements in half.
    Language changes for the third compiler were confined to liberalizations of some prior restrictions—there were no changes that required rewriting older programs for the new system. A king-size program that requires 12,000 words of memory can be compiled in less than an hour—the average run is about 30 minutes. We are now compiling about 20-25 programs per month, not counting about twice that number per month of partial runs that reveal programmer logical errors.
    Some of the features of our latest compiler may be of interest. Half-way through a compilation an English Language analyzer is produced, which lists the original statements' text, and shows all cross-references between statements. This is valuable for making later program changes. Lists are also given of the various files, data nouns, storage areas and tables used in the program, showing the statement numbers of all processing steps that refer to each. If there are any errors in the program this list is also present, and the compiler quits without going further.
    The Honeywell symbolic coding language, which the compiler produces, contains a provision for inserting "remarks" lines among the coding. The compiler uses this facility in two ways: first, each small section of coding, which corresponds to a clause of a statement, is preceded by remarks that give the statement and clause number, as well as the English text of the clause. Second, the entire original text of the program is converted to remarks lines that appear at the end of the program listings. Having all this information in one package helps program maintenance considerably.
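    The interleaving described above can be sketched roughly as follows (illustrative only; the statement text and generated operations are invented, not Metropolitan compiler output or ARGUS code):

```python
def emit_with_remarks(clauses):
    """For each (statement number, source text, generated ops) clause,
    precede the generated coding with a remarks line that carries the
    clause number and its original English text."""
    out = []
    for number, text, ops in clauses:
        out.append(f"R  STMT {number}: {text}")   # remarks line
        out.extend(f"   {op}" for op in ops)      # generated coding
    return out

listing = emit_with_remarks([
    (1, "ADD PREMIUM TO COMMISSION",
     ["LDA PREMIUM", "ADD COMMISSION", "STA COMMISSION"]),
])
print("\n".join(listing))
```

    Carrying the source text alongside the generated code in one listing is what makes later maintenance of the object program tractable.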
    Extract: Verbosity defended
    Verbosity defended
    A frequently heard criticism of English Language programming is that it is too verbose in stating a problem. Some suggest allowing one to write RD instead of READ, or A instead of ADD. A similar complaint has to do with our insistence on programmers attaching
    meaningful names to data fields—instead of calling them INSURANCE, COMMISSION, PREMIUM, the suggestion is to allow something short, like Al, A2, A3, on the grounds that the programmer will know what he is doing, and why write all those extra letters? There are several replies to this line of thinking:

    (1) The redundancy in "ADD," as compared with "A," is a valuable error-detection device, to guard against some random key-punch error changing the meaning of a whole processing step without immediate detection. If A stands for ADD, and S for SUBTRACT, how easy is it for a careless operator to punch one instead of the other? But mere carelessness will never give SUBTRACT instead of ADD. Very well then, but why not allow SBTRCT instead of SUBTRACT? The answer is that anyone can spell SUBTRACT, but why need one take the time to look up SBTRCT in a list of authorized abbreviations?

    (2) An English Language Compiler is not merely an instrument for converting a problem-oriented language into a machine-oriented language. It is also a communication and documentation device, which allows close supervision of programmers during the development phase of a project, as well as providing for easy transfer of programs from one group of programmers to another. Where a program has been doing steady production work for a year and now needs revision, use of this type of compiler facilitates the job of refamiliarization. All of these advantages tend to drop in effectiveness if there are special symbolics used whose meaning is not readily obvious.

    (3) In any event, no one programs at dictation speed, or even writing speed. The amount of time it takes to decide what to write is far greater than the amount of time it takes to write it.
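    The redundancy argument in point (1) is, in modern terms, a claim about the edit distance between opcodes. A minimal sketch (my illustration, not anything from the Metropolitan compiler or ARGUS):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance by dynamic programming: the minimum number
    of single-character insertions, deletions, or substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]

# One mispunched character turns A into S, silently changing the verb;
# no single mispunch can turn ADD into SUBTRACT.
print(edit_distance("A", "S"))           # 1
print(edit_distance("ADD", "SUBTRACT"))  # 7
```

    The single-letter codes sit one keystroke apart; the full words are far enough apart that a random error almost always produces an invalid word the compiler can reject.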
    Extract: ARGUS and Mercury
    I will return to the compiler story later, but it might be well to mention here three other aspects of H-800 software which had to be evaluated. One was the ARGUS assembly system, for converting mnemonic coding into machine coding, and bringing library routines into a completely assembled program. Another software question, though not a program, was the Honeywell convention system in regard to data tape layout, handling of signs, rules regarding the handling of parallel-running programs, etc. The third major question was the Executive Routine.
    The ARGUS assembly system appeared to be in good working order, and although we did not want to use all of its various features, the ones we needed seemed satisfactory, so we decided to leave well enough alone in this area, for a few years' time at least. Much later, in December 1963, we started work on a replacement for it, and this is now coming into use in the Company. The replacement, which we call "Mercury", is much faster, and has tighter controls.

          in The Computer Journal 8(4) 1965
  • Stock, Karl F. "A listing of some programming languages and their users" in RZ-Informationen. Graz: Rechenzentrum Graz 1971, 21. Abstract: 321 Programmiersprachen mit Angabe der Computer-Hersteller, auf deren Anlagen die entsprechenden Sprachen verwendet werden können. Register der 74 Computer-Firmen; Reihenfolge der Programmiersprachen nach der Anzahl der Herstellerfirmen, auf deren Anlagen die Sprache implementiert ist; Reihenfolge der Herstellerfirmen nach der Anzahl der verwendeten Programmiersprachen.

    [321 programming languages, with an indication of the computer manufacturers on whose machines each language can be used. Index of the 74 computer companies; the programming languages ordered by the number of manufacturers on whose machines each is implemented; the manufacturers ordered by the number of programming languages in use.]
  • Sammet, Jean E., "Programming languages: history and future"
          in [ACM] CACM 15(06) (June 1972)