PLUSS + Ada
in Alfonso Miola, editor, Proceedings Design and Implementation of Symbolic Computation (DISCO 90), volume 429 of Springer LNCS, pages 205-214, April 1990
Software development with abstract data types is usually done by first specifying the types and then implementing them. Specifications may be formal (equational) or not, and in the former case they may be executable or not. In this framework, two main questions arise: how can we prove the correctness of an implementation with respect to its specification? And is there any help in transforming a specification into an implementation?
Roughly speaking, we may distinguish between formal verification and testing for the first question. Formal verification is better than testing because it assures the correctness of programs; however, it may be difficult to carry out, even with the help of appropriate tools, which is why testing is still in use. Concerning the second question, specifications and implementations may be kept separate in different stages, which interferes with software evolution, or else they may be joined together in a multiparadigm (wide-spectrum) language (which may really be a single language or else the combination of two or more existing languages into a single framework); in the second case, evolution may be performed with or without the help of transformation rules.
In this paper, we propose a multiparadigm language combining equational specifications and imperative programs into a single framework. Moreover, we require specifications to be operational (executable), and thus we adopt a testing approach to implementation correctness. To provide programmers with great flexibility, we build an execution procedure that permits execution of software combining specifications and imperative programs in an arbitrary manner; as a result, we may adopt a prototyping software development model in which a program is obtained by applying a sequence of refinement steps starting from a formal, possibly incomplete specification of the problem, and the product can be prototyped at every stage of development by combining a term-rewriting system and an interpreter.
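To make this mixed execution model concrete, the following is a minimal, purely illustrative sketch (the names, term encoding and operations are ours, not the paper's notation): equations on Peano naturals are executed by innermost rewriting, while a "double" function is assumed to be implemented imperatively, and the evaluator crosses between the two worlds.

```python
# Illustrative sketch only (not the paper's actual system): terms are nested
# tuples, e.g. ("add", ("s", ("zero",)), ("zero",)) for Peano naturals.

def rewrite(term):
    """One rewriting step from the equational specification of add."""
    if term[0] == "add":
        _, x, y = term
        if y == ("zero",):            # add(x, zero) = x
            return x
        if y[0] == "s":               # add(x, s(y)) = s(add(x, y))
            return ("s", ("add", x, y[1]))
    return None                       # no equation applies

def impl_double(n):
    """An 'implemented' function living alongside the specification."""
    return 2 * n                      # ordinary imperative code on machine integers

def to_int(term):                     # abstraction: term -> machine value
    return 0 if term == ("zero",) else 1 + to_int(term[1])

def from_int(n):                      # the converse, to re-enter the term world
    return ("zero",) if n == 0 else ("s", from_int(n - 1))

IMPLEMENTED = {"double"}              # hypothetical table of implemented symbols

def evaluate(term):
    """Innermost evaluation mixing term rewriting and implemented functions."""
    head, args = term[0], tuple(evaluate(a) for a in term[1:])
    term = (head,) + args
    if head in IMPLEMENTED:           # cross into the imperative world and back
        return from_int(impl_double(to_int(args[0])))
    r = rewrite(term)
    return evaluate(r) if r is not None else term
```

Under these assumptions, `evaluate` can process a term regardless of whether the functions it contains are specified or implemented, which is the flexibility the paper's prototyping model requires.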
We remark that our prototyping strategy does not prevent the use of a transformational approach. On the other hand, this kind of testing may sometimes be employed as a complement to formal verification to show the execution behaviour of a piece of software (for instance, to study interfaces with the user). We also stress the flexibility of the strategy: it requires neither the specification of a data type before its implementation (think of those procedural aspects whose specification really looks like an implementation) nor any fixed "implementation rate" (implementations may be built and tested all at once, or function by function); in any case, execution is always possible.
The rest of the paper is organised as follows. In section 2, we present the multiparadigm notation used in the paper. In section 3, we show how the execution environment may be customised. In section 4, the execution procedure is derived from some demonstrably correct transformation rules, and an example is shown in section 5. Finally, section 6 remarks on some open issues of the work, and section 7 gives the conclusions and future work.
Extract: A multiparadigm language
A multiparadigm language
The ideas we present in this paper are not intended to be tied to any concrete programming language: they may be applied to a specification language such as OBJ [FGMJ85] or ACT ONE [EM85], linked with any imperative programming language such as Ada, Modula or object-oriented ones. Also, they may be applied to any equational enrichment of an imperative programming language (Ada with equations, as in Anna [LH85]), or (with a few modifications) even to logic programming combined with imperative programming (as in [DDL88, Rad90]).
To be as general as possible, we use an ad hoc notation in this paper based on the concept of the abstract data type as design unit. It allows, on the one hand, formal, possibly incomplete specification of types as a step prior to their implementation; on the other hand, gradual implementation of those types supported by a programming environment able to incrementally execute programs, even if they combine specified types with implemented ones. Data type encapsulation mechanisms are called universes; they may define new types (possibly parameterised), or enrichments of types obtained by adding new operations to existing ones; depending on their contents, we talk about specification universes or implementation universes.
Specification universes present constructs to define type signatures (that is, types' names -called sorts- and operations) and properties. Properties are stated in equational style; equations may be conditional, the conditions in turn being equations. The logic used to express equations makes it possible to associate initial semantics with specifications [GTW78], and thus a term-rewriting system can be used to manipulate terms during the execution of programs. The notation of implementation universes presents Ada-like programming language constructs. Each implementation universe must refer to a specification universe (which presents at least the signature of the type) and may include data structures and/or functions with no side effects. In many contexts, prototyping can take place only if there exists an abstraction function [Hoa72] to find out which value of a type is represented by a particular state of the implementation. Thus, implementation universes should provide this function in order to allow prototyping in any intermediate state; despite being defined in implementation universes, the abstraction function may be stated either in equational or imperative style.
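As a hypothetical illustration of conditional equations executed as rewrite rules (our own names and encoding, not the paper's notation), consider list membership, whose equations are guarded by conditions that are themselves equations over a parameter operation eq:

```python
# Hypothetical conditional equations for list membership, read as rewrite rules:
#
#   member(x, nil)        = false
#   member(x, cons(y, l)) = true          if eq(x, y) = true
#   member(x, cons(y, l)) = member(x, l)  if eq(x, y) = false

def eq(a, b):
    """Parameter operation of the specification; here, structural equality."""
    return ("true",) if a == b else ("false",)

def member(x, lst):
    if lst[0] == "nil":
        return ("false",)             # first equation
    _, y, rest = lst
    if eq(x, y) == ("true",):         # condition of the second equation holds
        return ("true",)
    return member(x, rest)            # third equation applies otherwise
```

The conditions partition the cases, so the rules are confluent and the function is well defined, which is what allows initial semantics to be executed by rewriting.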
In both kinds of universes some structuring mechanisms can be employed. The basic one is the use of specifications, which makes all the symbols exported by a specification universe available to its users. There is also a parameterisation facility to define generic universes, coupled with a parameter passing mechanism (instantiation); formal parameters are defined in the so-called characterisation universes. Last, there are some constructs to control the scope of symbols in specifications: symbols may be renamed, hidden or introduced as private from the very beginning.
As an example, we give the highlights of the specification of a generic type for directed, unlabelled graphs and its implementation with an adjacency matrix. In the specification, the parameters are the node type elem and an equality operator eq (to specify membership), both of them defined in the characterisation universe ELEM_= (the first as a sort and the second as an operation whose properties are stated). The operations empty and add are the so-called generating operations, which "generate" all the values of the type and serve as the basis for the specification of the other operations in the initial semantics framework. Note the use of the specification of booleans, the definition of auxiliary functions, the instantiation and renaming of a generic type for lists in order to define the signature, the specification of erroneous situations and the use of conditional equations. The abstraction function is given in imperative style; when this is the case, a variable of a generic type term is used to hold the result. Also, a representation invariant is provided, which may be used during execution.
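A rough sketch of how such an implementation universe might look (the generator names add_node/add_edge and the class layout are our assumptions, not the paper's actual example): an adjacency-matrix representation with a representation invariant and an abstraction function that rebuilds a generator term from the concrete state.

```python
# Hypothetical adjacency-matrix implementation of the graph type (names are
# illustrative), with the abstraction function and representation invariant
# the paper requires for prototyping intermediate states.

class Graph:
    def __init__(self, nodes):
        self.nodes = list(nodes)
        n = len(self.nodes)
        self.adj = [[False] * n for _ in range(n)]     # adjacency matrix

    def add_edge(self, u, v):
        self.adj[self.nodes.index(u)][self.nodes.index(v)] = True

    def invariant(self):
        """Representation invariant: the matrix is square and matches the nodes."""
        n = len(self.nodes)
        return len(self.adj) == n and all(len(row) == n for row in self.adj)

    def abstraction(self):
        """Abstraction function: which abstract value this state represents,
        expressed as a term over the generating operations."""
        term = ("empty",)
        for u in self.nodes:
            term = ("add_node", term, u)
        for i, u in enumerate(self.nodes):
            for j, v in enumerate(self.nodes):
                if self.adj[i][j]:
                    term = ("add_edge", term, u, v)
        return term
```

Because abstraction() yields an ordinary term, the rewriting machinery can take over from any intermediate imperative state, which is the point of requiring the function in every implementation universe.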
In the general case, there may be many levels of implementation; for instance, graphs may also be implemented by (adjacency) lists, which may in turn be implemented by arrays. In any case, the bottom level consists of predefined types such as arrays and integers.
Lastly, let us clarify some language concepts from the execution point of view. First of all, there is no main program in applications; instead, we may pick any function to be the subject of execution at any time. As a result, we may assimilate the notion of execution to the idea of evaluating a term, that is, the application of a function to its parameters. Throughout the paper, a term is called an expression when it is used in a context that requires its evaluation; note that an expression may contain many proper subexpressions. Inside expressions, objects appear, which will be evaluated at the appropriate moments (we denote by the word object any of the following entities: variables, parameters, constants -predefined or user-defined- and function results). As a key point during execution, an object's value may be either a data structure or a term; in the first case, we talk about implemented objects, while in the second we talk about specified objects. Note that objects may be intertwined due to the existence of many levels of implementation; that is, an object of type graph may be implemented with an array whose components, of type list, may be specified (if no implementation exists for that type), while the elements inside the lists (for instance, integers) may in turn be implemented. In this case we talk about hybrid objects. The figure below illustrates this situation.
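One hypothetical way to realise such hybrid objects (our own tagging scheme, not the paper's representation) is to mark every value with its status, so the evaluator can tell at each level whether it holds a data structure or a term:

```python
# Hypothetical tagging scheme: every value records whether it is implemented
# (a data structure) or specified (a term).
def impl(value):
    return ("impl", value)

def spec(term):
    return ("spec", term)

# A graph implemented as an array (here, a Python list) whose components are
# *specified* adjacency lists, whose elements are *implemented* integers:
graph = impl([
    spec(("cons", impl(1), ("cons", impl(2), ("nil",)))),  # neighbours of node 0
    spec(("nil",)),                                        # node 1 has no edges
])

def is_hybrid(value):
    """True when an implemented value directly contains specified components."""
    tag, payload = value
    return (tag == "impl" and isinstance(payload, list)
            and any(c[0] == "spec" for c in payload))
```

The tags make the implemented/specified boundary explicit at every level, which is what lets the execution procedure decide between the interpreter and the term-rewriting system.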
Extract: Conclusions and future work
Conclusions and future work
A framework for software development by means of prototyping has been presented. We use an abstract data type oriented language that combines equations and imperative programs; software refinement thus consists of gradually transforming the specified types into implemented ones, with the ability to prototype the application at any stage of development, provided that specifications are operational and that an abstraction function and a representation invariant exist for every implementation.
Since the evaluation of expressions is the central issue of execution, we have identified a number of rules to transform an expression while assuring that its value is preserved; the crucial point is how specified and implemented functions and values coexist. We have then formulated an execution procedure which gives preference to testing over efficiency and which involves a term-rewriting system to evaluate specified functions and an interpreter for imperative ones.
Other projects exist that support execution of equations mixed with imperative programs. The closest to ours is the execution system of Asspegique [CK90], which combines PLUSS specifications with Ada implementations. There is a basic difference, however: its mixed execution relies on the compilation of equations into Ada and on the generation of packages for the modules involved. We think that, in the testing framework, compilation prevents easy interaction between the user and the execution system, as well as quick and interactive modification of the rewrite rules and customisation of the environment. This compilation scheme has been adopted by many other teams [DDL88, Rad90].
As for future work, we want to allow mixed execution even if more than one implementation of the same type is used in different contexts of the same program. This issue has been addressed in [Fra94] and its integration into the current work seems natural: no new constructs are required in the language (the abstraction function is enough) and initial semantics still provides a valid formal framework. On the other hand, the efficiency of computations should be improved. Mainly, we are studying the benefits of sharing subterms and the advantages of maintaining a twofold value in objects (the result of an abstraction and the value before abstraction). We are also considering the possibility of a lazy implementation of the final evaluation, which would be possible thanks to the sharing of subterms.
in Procs. 7th International Symposium on Programming Languages: Implementations, Logics and Programs (PLILP), Utrecht (The Netherlands), September 1995, LNCS 982, view details
The present volume is an annotated bibliography which attempts to provide an up-to-date overview of past and present work on algebraic specification. No attempt is made to provide a coherent introduction to the topic for beginners; the intention is rather to provide a guide to the current literature for researchers in algebraic specification and neighbouring fields. Some indications of how the different approaches are related are included.
Extract: Extended ML
The long-term goal of work on Extended ML is to provide a practical framework for formal development of Standard ML programs, together with an integrated suite of computer-based specification and development support tools and complete mathematical foundations to substantiate claims of correctness. The complete formal definition of the Extended ML language constitutes an important milestone in this programme, necessary to provide a basis for further research on foundations and tools. One of the key issues that had to be resolved was the design of a logic suitable for describing properties of Standard ML programs, taking into account features like recursive type and function definitions, exceptions, polymorphism, and non-termination. A number of errors in the Standard ML semantics have been discovered in the course of this work. Work has begun on rules for proving theorems about EML specifications, to be used in verifying correctness conditions that arise in the course of formal development. Research has so far concentrated on equality reasoning and quantifier rules.
The length and requisite formality of the definition of Extended ML renders it rather difficult to penetrate. Accordingly, [532, 534] provides an informal overview of the definition, explaining most of the main issues involved and justifying some of the choices taken.
Extract: Asspegique and Asspegique+
ASSPEGIQUE+ is an integrated environment for the development of large algebraic specifications and the management of a specification data base. The ASSPEGIQUE+ specification environment supports the specification language PLUSS [104, 93, 368]. The previous version of this environment, ASSPEGIQUE [100, 98], supported only the enrichment primitive, while ASSPEGIQUE+ [209, 819] also supports the renaming, parameterization and instantiation primitives. The tools available in the ASSPEGIQUE+ specification environment include browsing tools, a specification editing tool, a specification modules integrator, symbolic evaluation tools and interfaces to other tools or environments. Most of these tools use CIGALE, a system for incremental grammar construction and parsing, which has been especially designed to cope with a flexible, user-oriented way of defining operators, with coercion and overloading of operators.
While the aim of the ASSPEGIQUE+ environment is to provide the user with a wide range of tools, this is achieved both through specific ASSPEGIQUE+ tools and through interfaces to tools built by other colleagues. Interfaces between ASSPEGIQUE and REVE and SLOG were first built; then a generic way to provide inter-operability between tools was designed [102, 211, 832]. The principles developed for designing a generic interface between algebraic specification tools or environments comprise both issues concerning a common interchange format and issues regarding interaction with the user, that is, how to get and display information using the specification language known to the user. These ideas were used for developing interfaces with the proof checker LP and, within the SALSA project, interfaces with environments for the algebraic specification languages LPG and GLIDER.
The demand for reusability originates from economic considerations and attempts towards standardization. Rather than always starting from scratch, the use of existing components is cheaper and also less error-prone. A central problem for the identification and the correct use of reusable components is the abstract description of such components. A formal specification is the only form of description that can serve as a basis for a correctness proof; it can be processed automatically and it establishes a degree of confidence in the functionality of the component that is particularly important if a component has to be modified before being reused.
Goguen proposes the algebraic specification language OBJ as well-suited for the design of reusable software systems. A component's interface is specified as an abstract data type and may be parameterized by other components. Combination of components is possible by instantiation using appropriate fitting morphisms. A similar approach is used in Extended ML.
In ACT TWO, components are modules, which consist of two interface specifications, i.e. an export specification and an import specification, and a body specification which implements the export specification in terms of the import specification. Similarly, in LARCH, a component consists of a pair: an "abstract" interface specification and an implementation. Here the implementation is written in a conventional programming language. A similar distinction between a requirement and a design specification is made in PAnndA-S, the language of the PROSPECTRA project, based on the notion of visible and private interface in Ada. Ada bodies are generated semi-automatically by transformation.
In the approach of Matsumoto, a component consists of four parts at four different levels of abstraction: a requirement specification, a design specification, a source program and object code. Components are written in a so-called wide-spectrum language which provides facilities for system design and software development at many levels of abstraction. Matsumoto uses Ada, which has as a consequence that the requirement specification is merely an informal description. CIP-L and COLD are two languages which include algebraic specifications as well as predicative and imperative language styles in one coherent framework and are therefore better suited for such an approach.
In the object-oriented approach, a reusable component is represented by a graph of object-oriented programs, each node of which stands for one level of abstraction in the description of the software. In the approach of the ESPRIT project DRAGON, a reusable component similarly consists of a tree of algebraic specifications, where each node of the tree is an instance of a certain level of abstraction of the program, with the root being the most "abstract" formal specification and the leaves representing the "concrete" implementations. A methodology for reusing components has also been described.