OMNITAB(ID:559/omn007)

Statistical analysis and desk calculator 


The original electronic worksheet

"OMNITAB [...] is designed to provide a close parallel to the modus operandi in carrying out calculations with a desk calculator and a multi-columned (and multi-lined) worksheet. "

"OMNITAB simulates desk computing in that it replaces the desk calculator, the mathematical tables and the multicolumn worksheet. The user is to regard the machine as a 46-column by 101-row tablet on which carry out its work."


Related languages
AARDVARK => OMNITAB   Influence
BMD => OMNITAB   Influence
BOUMAC => OMNITAB   Influence
COGO => OMNITAB   Influence
DAM => OMNITAB   Influence
OMNIFORM => OMNITAB   Evolution of
OMNITAB => IMP   Implementation
OMNITAB => OMNITAB II   Evolution of
OMNITAB => PRECISE   Enhancement of
OMNITAB => REFORM   Evolution of
OMNITAB => Resampling Stats   Influence
OMNITAB => SHRIMP   Implementation
OMNITAB => TABOL   Generalisation of

Samples:
References:
  • "OMNITAB on the 90: an NBS, English-language program" view details Extract: OMNITAB on the 90
    OMNITAB ON THE 90
    an NBS, English-language program
    A computer program that permits scientists and others unfamiliar with programming to communicate with a 7090 by using English sentence commands has been developed by the National Bureau of Standards. It is called OMNITAB.
    The program is used for the calculation of tables of functions, for solutions to non-linear equations, and for statistical and numerical analysis of tabular data. It is designed to allow rapid computation of routine laboratory problems.
    With OMNITAB, various sections of problem analysis can be checked independently to determine proper programming procedures; data can be checked for validity, and one-shot jobs can be done with a working program.
    A wide variety of mathematical and manipulative procedures are available in the OMNITAB routine. There are provisions for raising to powers, use of logarithms to base 10 and base e, elementary and special functions, curve fitting, integration, differentiation, interpolation, etc., in addition to the basic arithmetical operations. The program has a capacity of 7.2K results, arranged in 36 columns of 200 rows each.
    A statistical analysis package which computes the average of a set of numbers (200 maximum) and 30 statistical measures related to the average, dispersion, randomness, and other properties of the distributions, has been incorporated in the program. It is anticipated that this analysis, which takes less than a minute of machine time, will have a standardizing influence on the statistical analysis of laboratory data.
    OMNITAB is the work of Joseph Hilsenrath, Philip J. Walsh and Guy G. Ziegler of NBS.

          in Datamation 9(3) March 1963
  • Cameron J. M. and Hilsenrath J. "Use of general-purpose coding systems for statistical calculations" in Proceedings of the IBM Scientific Computing Symposium on Statistics (1963). view details Extract: Context of OMNITAB

    In the last few years the speed of computers has increased so much that computer time is responsible for only a small fraction of the cost of a computation; the major cost is now that of programming personnel, particularly those who have some background in physics, chemistry or engineering. This scarcity has led many subject-matter specialists to learn programming. This is of course wasteful, in the sense that a number of do-it-yourself activities are wasteful, and also involves an additional loss because of the time taken from the pursuit of the subject matter in which the programmer is a specialist.

    Although inefficient, it is justified because of the relatively low cost and high speed of modern machines and because of the inaccessibility of professional programmers. The system is unsatisfactory: the scientist involved becomes more programmer than physicist or chemist and either has to train each new man to code or ends up doing his problems himself. This provides the motivation for a general-purpose code to facilitate communication with the machine, a code which requires as little specialized computer knowledge as possible. For the statistician, general-purpose codes for regression, time series, etc., would probably suffice. There may be other equally good collections of codes, but the programs prepared by the UCLA Biomedical Data Processing Group (BMD) come as near to filling the needs of the statistician as we can reasonably hope to get. Yet in many applications the experimenter wants tables made from the results of curve fitting, plots of auxiliary functions, or auxiliary constants such as air density, virtual temperature, etc. In a number of cases the statistical analysis is a minor part of the calculation and printout. It is a misuse of the statistician's abilities for him to expend his energies on preparing such nonstatistical programs.

    This desire for a general-purpose coding system in which the machine is programmed by the writing of English sentences has brought forth a number of systems: AARDVARK (Iowa State), BOUMAC (National Bureau of Standards Boulder Laboratories), COGO (MIT), DAM (International Monetary Fund), etc. The following quotations from four of these programs show the strong and consistent motivation toward the utilization of a language oriented toward the subject-matter specialist rather than the machine:

    COGO - a computer programming system for civil engineering problems ... has the following characteristics: The instructions or commands to the computer, which the engineer uses to express the solution of a problem, are at approximately the same technical language level as instructions which one engineer would use in describing his solution to another engineer. (Miller, 1961, p. 1)

    The above presentation may be regarded as a Problem Oriented Control Language oriented toward Multivariate Analysis. It is felt that this language is close to the language a statistician would use to express his problem to other statisticians so that the statistician wishing to perform certain multivariate analyses has a minimum of computing rules to learn before he can present his problem. (Cooper, 1963, p. 27)

    The program, hereafter referred to as DAM (Data processing And Multiple regression), facilitates the preparation of input data to be used in multiple regression analysis. It enables users without specific programming knowledge to write sequences of desired computations essentially in the form and [...]
    (Boissonneault, 1962, p. 1)

    A formal approach to the routine analysis of kinetic data in terms of linear compartmental systems is presented. The methods of analysis are general in that they include much of the theory in common use, such as direct solution of differential equations, integral equations, transfer functions, fitting of data to sums of exponentials, matrix solutions, etc. The key to the formalism presented lies in the fact that a basic operational unit-called "compartment" - has been defined, in terms of which physical and mathematical models as well as input and output functions can be expressed. Additional features for calculating linear combinations of functions and for setting linear dependence relations between parameters add to the versatility of this method. The actual computations for the values of model parameters to yield a least squares fit of the data are performed on a digital computer. A general computer program was developed that permits the routine fitting of data and the evolution of models. (Berman, Weiss and Shahn,1962,p.289)

    As useful as these systems are in their specialized fields, there remained a need for a system directed toward the wide variety of mathematical and numerical calculations arising in the physical sciences and engineering. OMNITAB (Hilsenrath, 1963) was designed to meet this need. Extract: OMNITAB
    OMNITAB is a completely assembled, interpretive program for the IBM 7090 and 7094 which permits direct use of the machine by scientists or engineers without knowledge of programming. Instructions, written in the form of English sentences, control the flow of calculations in a manner highly analogous to the logic which prevails in carrying out computations on a desk calculator. More precisely, OMNITAB simulates desk computing in that it replaces the desk calculator, the mathematical tables and the multicolumn worksheet. The user is to regard the machine as a 46-column by 101-row tablet on which to carry out its work. The vocabulary and sentence structure are, to a large extent, self-explanatory and are shown in Figure 1.

    The instructions for the program are given via a series of sentences, one to a card. Arguments are entered or generated in specified columns; then, mathematical or manipulative operations are performed on desired columns; finally, the results are stored in the designated column. The mathematical operations include the arithmetic operations, the elementary and special functions, and various statistical and numerical analyses. The manipulative operations provide for inverting, promoting, demoting, exchanging, shortening, erasing, printing, and punching columns of numbers.

    Although each instruction is given in sentence form, only the first word (or the first six characters, if the word has more than six characters) of the sentence and the numbers are crucial to the operation. The machine scans the whole card, picks up only the first word and the numbers, and ignores the intervening words completely. For this reason, the intervening words are not really needed; their use, however, makes the instructions easier to read. In an instruction statement, a number written with a decimal point is read as itself, while a number without a decimal point designates a column. Thus, ADD 1. TO 2 means to add the value 1. to the numbers stored in column 2, while ADD 1 TO 2 means to add the numbers in column 1 to those in column 2, and RAISE 2 TO 3 means to raise each of the numbers in column 2 to the power of the corresponding number in column 3.
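    The scanning and decimal-point rules described above can be sketched as a toy interpreter in Python. This is a hypothetical illustration, not part of OMNITAB; the ADD and RAISE semantics follow the examples in the text, and the class and method names are inventions:

    ```python
    import re

    class Worksheet:
        """Toy model of OMNITAB's column worksheet (a sketch, not the real program)."""

        def __init__(self, rows=101, cols=46):
            self.col = {c: [0.0] * rows for c in range(1, cols + 1)}

        def execute(self, sentence):
            # Only the first word (truncated to six characters) and the numbers
            # matter; every intervening word is ignored, as OMNITAB's scanner did.
            verb = sentence.split()[0][:6].upper()
            tokens = re.findall(r"\d+\.\d*|\d+", sentence)
            # A number with a decimal point is a literal constant; a bare
            # integer designates a worksheet column.
            args = [float(t) if "." in t else int(t) for t in tokens]
            if verb == "ADD":
                a, dest = args
                addend = self.col[a] if isinstance(a, int) else [a] * len(self.col[dest])
                self.col[dest] = [x + y for x, y in zip(addend, self.col[dest])]
            elif verb == "RAISE":
                base, expo = args
                self.col[base] = [x ** y for x, y in zip(self.col[base], self.col[expo])]

    ws = Worksheet()
    ws.col[1], ws.col[2] = [2.0, 3.0], [10.0, 20.0]
    ws.execute("ADD 1 TO 2")     # column arithmetic: column 2 becomes [12.0, 23.0]
    ws.execute("ADD 1. TO 2")    # literal constant: column 2 becomes [13.0, 24.0]
    ```

    Because only the verb and the numbers are scanned, "ADD THE NUMBERS IN 1 TO 2" would execute identically to "ADD 1 TO 2".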

          in Proceedings of the IBM Scientific Computing Symposium on Statistics 1963 at the Thomas J. Watson Research Center in Yorktown Heights, New York, on October 21, 22 and 23, 1963.
  • Hilsenrath, J. "OMNITAB: a second generation general purpose computer program" Nat. Bur. Standards Tech. News Bull., 1963 47: 14-15.
  • Hilsenrath, Joseph; Ziegler, Guy G.; Messina, Carla G.; Walsh, Philip J. & Robert J. Herbold "OMNITAB - A Computer Program For Statistical and Numerical Analysis" National Bureau of Standards Handbook 101, Washington, D.C. (1966).
  • Beam, Alfred E "PRECISE: A Multiple Precision Version of Omnitab" Technical Note 446: June 1968. U.S. Department of Commerce, National Bureau of Standards Abstract: This users manual describes PRECISE - a completely assembled interpretive program for the IBM 7090/7094 which enables the user to carry out arithmetic operations and function generation in multiple precision (accuracy to 28 significant figures). PRECISE operates as a sub-monitor under the IBSYS or DC-IBSYS monitor systems. Appendixes describe how jobs are set up to be run under the PRECISE sub-monitor, and how the system may be expanded to include new subroutines. The program, which responds to instructions in the form of plain English sentences or contractions thereof, has provision for handling numbers out of the normal 7090/7094 range. It handles numbers as large as 10 to the (10 to the 9th) power. Other features of the program include: free-field input; a work-sheet of 7,500 cells (3 x 2500 computer words) which may be dimensioned by the user at run time (75 rows by 100 columns, 300 rows by 25 columns, etc.); solution of systems of linear equations in as many as 85 unknowns; flexible formatting; tape handling facility; and row and column sums. A description of the UOM is included as an appendix.

    Extract: FOREWORD
    The work which is reported here was started at the National Bureau of Standards and was completed at the University of Maryland after one of the authors (AEB) transferred to that Institution. The final version of PRECISE and of the Multiple Precision Package upon which it was built was prepared at the Computer Science Center of the University of Maryland and was supported in part by grant NsG-398 of the National Aeronautics and Space Administration.  
    Extract: INTRODUCTION
    INTRODUCTION

    One of the more troublesome problems that confront the careful user of modern computers is the loss of significance resulting from round off and other computing pitfalls. In many calculations rounding errors are serious sources of annoyance -- in some they are downright fatal. While the recent trend to build computers with built-in hardware for double-precision operations is a decided help in this regard, the user must still be on guard for possible flaws in the hardware, or in the algorithms, and even, unhappily, for errors in important constants used by the compiler or the conversion routines.

    The problem has gotten worse recently as a consequence of the fact that many of the third generation computers have a shorter word length. As a result, programs which previously gave suitable answers in single precision now must be run in double precision.
      
    We are tempted to speculate that if the cloak of anonymity were removed from commercial software systems and each subroutine or program or compiler segment were to carry the by-line of its author or authors, then, perhaps, there would be fewer errors. Be that as it may, there is a clear need for some yardsticks by which the accuracy of computer results can be judged. There is a need for a system which can deliver correct answers to a reasonably large number of significant digits even when handling exceedingly large or small numbers.
      
    The release to SHARE in 1963 of a multiple precision package (UOM MPP SHARE DIST. NO. 30B1) by Alfred E. Beam was a considerable boon to professional programmers using IBM 7090-94 computers. In spite of the existence of the MPP package and doubtless other similar packages, the problem of programming in multiple precision involving the elementary trigonometric and transcendental functions is still by no means a trivial job. Nor is it easy even today to solve a large system of linear equations (in say 85 unknowns) and retain adequate accuracy.

    Last place "errors" are so much a part of even reliable mathematical tables as to cause L. J. Comrie, a well-known table maker, to write a short piece entitled "What is an Error" (MTAC*, V.2, 1943, pp 284-286) in which he explains that when the seventh, eight, and ninth places in an entry that when the seventh, eight, and ninth places in an entry in a mathematical tahlc are 4,9,9 or 5,0,0, it matters little to the man who wants only seven places exactly what the tenth or eleventh place is. Thus, Comrie continues "... on more than one occasion I have written to our beloved editor saying 'I have found ... errors of less than one unit in ... tables, but am not sending them to you, lest you should be tempted to publish them.'"

    Table makers are quite willing to accept these last-place or end-figure "errors" because of the tedium of carrying out check calculations to three or more figures beyond those that they normally carry. PRECISE carries out calculations to many figures as a matter of course. Thus, there is really no need to tolerate "end-figure" errors.

    Soon after it became clear that the philosophy behind the organization and implementation of the OMNITAB general-purpose computing program on the 7094 was sound enough to attract a wide audience of problem solvers, whom even FORTRAN had not reached, we turned our attention for a time  to the design of a comparable system for more precise calculations than were then possible in single precision. This system drew heavily on the multiple precision package designed by one of the authors to spare professional programmers the tedium of writing painstaking instructions for the computer to handle double and triple precision and out-of-range arithmetic. This report describes how the MPP package has been further employed to provide nonprogrammers with a computer tool for very precise calculations without the need to resort to conventional, and in this instance, very tedious programming.

    The PRECISE program which is discussed here was designed to carry out arithmetic operations and function generation often to as many as 28 significant digits and at the very least to 21 figures. Except when instructed to increase the ranges, the program normally handles numbers x in the range 10exp(-76) to 10exp(76) and gives results to 28 significant figures. The program can also handle numbers outside of the above range. The greatest or smallest power of 10 can be as high as plus or minus one billion. In this extreme case the results are good only to 21 figures.
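    As a modern point of comparison (purely illustrative, not part of PRECISE), the 28-significant-figure working precision described above can be reproduced with Python's decimal module, whose default context precision also happens to be 28 digits:

    ```python
    from decimal import Decimal, getcontext

    getcontext().prec = 28   # match PRECISE's 28 significant figures

    # Ordinary double-precision binary floating point carries roughly 16
    # decimal digits; at 28 digits, end-figure rounding is pushed well
    # past the places a table maker would normally carry.
    third = Decimal(1) / Decimal(3)
    print(third)             # 1/3 to 28 significant figures

    # Magnitudes at the edge of PRECISE's normal range, e.g. 10exp(-76),
    # are representable directly:
    tiny = Decimal(10) ** -76
    print(tiny)
    ```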

    PRECISE, like its predecessor, OMNITAB, is designed to provide a close parallel to the modus operandi in carrying out calculations with a desk calculator and a multi-columned (and multi-lined) worksheet. While the worksheet in OMNITAB for the 7094 was fixed at 101 rows by 46 columns, the 7500 cells (3 x 2500 computer words) set aside for the worksheet in PRECISE can be dimensioned at the start of each problem at run time.
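    The run-time dimensioning rule (any rows-by-columns shape drawn from the fixed 7,500-cell budget) can be sketched in Python; the function name and error handling here are inventions for illustration:

    ```python
    # PRECISE's worksheet budget: 7,500 cells, with the rows-by-columns
    # split chosen by the user at the start of each problem.
    TOTAL_CELLS = 7500

    def dimension(rows, cols):
        """Return an all-zero worksheet if the requested shape fits the cell budget."""
        if rows * cols > TOTAL_CELLS:
            raise ValueError(f"{rows} x {cols} exceeds the {TOTAL_CELLS}-cell worksheet")
        return [[0.0] * cols for _ in range(rows)]

    ws_wide = dimension(75, 100)   # 75 rows by 100 columns, as in the abstract
    ws_tall = dimension(300, 25)   # 300 rows by 25 columns also fits exactly
    ```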
  • Sammet, Jean E. "Programming Languages: History and Fundamentals" Englewood Cliffs, N.J. Prentice-Hall 1969. pp.296-299
  • [NBS] A Systems Programmer's Guide for Implementing OMNITAB II. U.S. Department of Commerce, National Bureau of Standards. November 1970.
  • "OMNITAB II User's Reference Manual", NBS Tech Note 552 (Oct 1971). view details
  • Jowett, D.; Chamberlain, R. L. and Mexas, A. G. "OMNITAB--A Simple Language for Statistical Computations" The Journal of Statistical Computation and Simulation 1(2) 1972
  • Sammet, Jean E., "Roster of Programming Languages 1972" p198
          in Computers & Automation 21(6B), 30 Aug 1972 view details
  • Sammet, Jean E. "Roster of Programming Languages for 1973" p147
          in ACM Computing Reviews 15(04) April 1974 view details
  • Stock, Marylene and Stock, Karl F. "Bibliography of Programming Languages: Books, User Manuals and Articles from PLANKALKUL to PL/I" Verlag Dokumentation, Pullach/Munchen 1973 427 Abstract: PREFACE AND INTRODUCTION
    The exact number of all the programming languages still in use, and those which are no longer used, is unknown. Zemanek calls the abundance of programming languages and their many dialects a "language Babel". When a new programming language is developed, only its name is known at first and it takes a while before publications about it appear. For some languages, the only relevant literature stays inside the individual companies; some are reported on in papers and magazines; and only a few, such as ALGOL, BASIC, COBOL, FORTRAN, and PL/1, become known to a wider public through various text- and handbooks. The situation surrounding the application of these languages in many computer centers is a similar one.

    There are differing opinions on the concept "programming languages". What is called a programming language by some may be termed a program, a processor, or a generator by others. Since there are no sharp borderlines in the field of programming languages, works were considered here which deal with machine languages, assemblers, autocoders, syntax and compilers, processors and generators, as well as with general higher programming languages.

    The bibliography contains some 2,700 titles of books, magazines and essays for around 300 programming languages. However, as shown by the "Overview of Existing Programming Languages", there are more than 300 such languages. The "Overview" lists a total of 676 programming languages, but this is certainly incomplete. One author has already announced the "next 700 programming languages"; it is to be hoped the many users may be spared such a great variety for reasons of compatibility. The graphic representations (illustrations 1 & 2) show the development and proportion of the most widely-used programming languages, as measured by the number of publications listed here and by the number of computer manufacturers and software firms who have implemented the language in question. The illustrations show FORTRAN to be in the lead at the present time. PL/1 is advancing rapidly, although PL/1 compilers are not yet seen very often outside of IBM.

    Some experts believe PL/1 will replace even the widely-used languages such as FORTRAN, COBOL, and ALGOL.4) If this does occur, it will surely take some time - as shown by the chronological diagram (illustration 2) .

    It would be desirable from the user's point of view to reduce this language confusion down to the most advantageous languages. Those languages still maintained should incorporate the special facets and advantages of the otherwise superfluous languages. Obviously such demands are not in the interests of computer production firms, especially when one considers that a FORTRAN program can be executed on nearly all third-generation computers.

    The titles in this bibliography are organized alphabetically according to programming language, and within a language chronologically and again alphabetically within a given year. Preceding the first programming language in the alphabet, literature is listed on several languages, as are general papers on programming languages and on the theory of formal languages (AAA).
    As far as possible, most of the titles are based on autopsy. However, the bibliographical description of some titles will not satisfy bibliography-documentation demands, since they are based on inaccurate information in various sources. Translation titles whose original titles could not be found through bibliographical research were not included. In view of the fact that many libraries do not have the quoted papers, all magazine essays should have been listed with the volume, the year, issue number and the complete number of pages (e.g. pp. 721-783), so that interlibrary loans could take place with fast reader service. Unfortunately, these data were not always found.

    It is hoped that this bibliography will help the electronic data processing expert, and those who wish to select the appropriate programming language from the many available, to find a way through the language Babel.

    We wish to offer special thanks to Mr. Klaus G. Saur and the staff of Verlag Dokumentation for their publishing work.

    Graz / Austria, May, 1973
  • Swanson, J. M. and Riederer, S. "Using OMNITAB to Teach Statistics," CompUnCur, 4, 1973, pp128-134
  • Swanson, J. M., Riederer, S. A., Reynolds, E. and Harris, G. S. "IMP and SHRIMP: Small, Interactive Mimics of OMNITAB Designed for Teaching Applications," PCompScSt8, p. 84 1975
  • Rumble, John R. Jr. "OMNIDATA and the Computerization of Scientific Data" in A Century of Excellence in Measurements, Standards, and Technology A Chronicle of Selected NBS/NIST Publications, 1901-2000 NIST Special Publication 958 David R. Lide (ed)
  • Croarkin, M. Carroll "Statistics and Measurements" J. Res. Natl. Inst. Stand. Technol 106(1) January-February 2001 pp. 279-292 Extract: Statistical Computing
    Statistical Computing
    The ubiquitous use of statistics at NIST has come about for many reasons, one of which is certainly the development of state-of-the-art statistical computing tools within SED. In the early 1960s, Joseph Hilsenrath of the Thermodynamics Section, Heat and Power Division, conceived the idea of a spreadsheet program for scientific calculations. Together with Joseph Cameron and the support of several NBS sections, this idea led to a program called Omnitab [5]. Omnitab is an interpretive computing system with a command structure in English that performs scientific calculations on data in columns in a worksheet.
    When Cameron became Chief of SEL, he formed a team, headed by David Hogben, to complete the development of Omnitab as a sophisticated statistical package. By 1966, it was already strong in data manipulation, regression analysis with related diagnostic graphics and tests, one and two-way analysis of variance, special functions, and matrix operations. It quickly became the standard tool for statistical calculations at NIST. It was so innovative at the time that when Brian Joiner left SEL in the 1960s to teach at Pennsylvania State University, he took a copy of Omnitab with him for his students. A few years later, Joiner formed a company that revised the code and offered it for sale as the commercial package, Minitab.
    Omnitab is strong on analytical procedures but not on graphics output. In 1969, when James Filliben brought his perspective on exploratory data analysis (EDA) to NBS, he immediately saw the need for software with strong graphics capability, and he set about developing code to support his consulting activities that incorporated the best features of EDA. There was never a steering committee for this project as there was for Omnitab, but from the breadth of problems and data encountered in the NBS laboratories, a diverse and versatile package, called Dataplot [14], was conceived. The package is a workhorse for graphical and statistical analysis at NIST and is a repository for datasets from important NIST experiments. Because it is a free and down-loadable resource maintained by the Information Technology Laboratory, Dataplot has recently been interfaced with an on-line statistics handbook that is under development within the Statistical Engineering Division and SEMATECH. From the handbook pages, the reader can run examples of statistical approaches presented in case studies in the handbook.

    Resources
    • NIST at 100: Foundations for progress

      An Early Spreadsheet



      More than a decade before spreadsheet software helped to launch the boom in personal computers (PCs), NIST published Omnitab, a computer program for statistical and numerical analysis that had many attributes of a spreadsheet.


      Conceived by the Institute's Joseph Hilsenrath and based in part on colleague Joseph Wegstein's earlier work on a tabular computing scheme, Omnitab was written to automate routine programming tasks, such as handling data input and output and producing graphs, for NIST physicists, chemists, and engineers, who were thus freed to concentrate on higher level science. Omnitab proved so helpful that its use extended far beyond NIST. For about 10 years after its 1966 publication, it was popular with statisticians in agricultural research, private industry, and universities. Foreign-language editions appeared as well; the program could accept simple commands in French, German, and even Japanese.


      Omnitab was like a spreadsheet because it had an extensive and accurate math facility, a macro language, and a graphical output. Most importantly, it created a tableau in which the entries were calculated from input values. However, operations were defined for entire columns. VisiCalc, the "killer application" business spreadsheet unveiled in 1979 for PCs, allowed functions to be entered in individual cells and was more dynamic and interactive.


      Omnitab initially used an old programming language and did not migrate to PCs until after its heyday. But its influence persists today in the form of Minitab, a PC-based commercial software package for teaching statistics and for research in business and manufacturing.


    • Not the First Spreadsheet

      Not the First Spreadsheet


      You keep printing the statement that Dan Bricklin wrote the first spreadsheet program ("The 20 Most Important People," September). No doubt VisiCalc was the first successful commercial spreadsheet, but hardly the first spreadsheet. In the early '70s, I used a mainframe program called Omnitab II, from what was then the National Bureau of Standards. It used a fully developed spreadsheet metaphor, but given the scientific and engineering emphasis of the program, it was referred to as a "lab notebook." The mathematical facility was extensive and accurate. It had a macro language, and it produced graphical output. In short, it had all the attributes of the modern spreadsheet program.



      Steve Tedder