
1980 UBC CS Technical Report Abstracts

TR-80-01 Stiff Stability Considerations for Implicit Runge-Kutta Methods, January 1980 James M. Varah

In this paper, we discuss some recent stiff stability concepts introduced for implicit Runge-Kutta methods, and focus on those properties which are really important from a practical point of view. In particular, the concept of stiff decay is introduced and examined, and applied to the common IRK methods available. The Radau IIA formulas are shown to be particularly useful from this point of view.
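
A minimal numerical sketch (not from the report) of what stiff decay means in practice: the one-stage Radau IIA method coincides with backward Euler, whose stability function R(z) = 1/(1-z) tends to zero as z -> -infinity, so stiff transients are damped even at step sizes vastly larger than the fastest time scale.

    # Dahlquist test problem y' = lam*y, advanced by the one-stage
    # Radau IIA method (backward Euler): y_{n+1} = y_n / (1 - h*lam).
    lam = -1.0e6              # very stiff eigenvalue
    h = 0.1                   # step size with |h*lam| = 1e5
    y = 1.0
    for n in range(3):
        y = y / (1.0 - h * lam)   # one step; |R(h*lam)| ~ 1e-5
        print(n, y)
    # The stiff transient is damped essentially immediately, mimicking
    # the exact solution exp(lam*t) -> 0: this is stiff decay.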

TR-80-02 Reformulation of Boundary Value Problems in ``Standard'' Form, February 1980 Uri Ascher and Robert D. Russell

Boundary value problems in ODE's arising in various applications are frequently not in the ``standard'' form required by the currently existing software. However, many problems can be converted to such a form, thus enabling the practitioner to take advantage of the availability and reliability of this general purpose software. Here, various conversion devices are surveyed.
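
To make the ``standard'' form concrete: a second-order problem such as u'' = -u, u(0) = 0, u(pi/2) = 1 is converted to the first-order system y1' = y2, y2' = -y1 that general-purpose BVP software expects. A minimal sketch, using SciPy's solve_bvp as a modern stand-in for the software of the time:

    import numpy as np
    from scipy.integrate import solve_bvp

    def f(x, y):                     # y[0] = u, y[1] = u'
        return np.vstack([y[1], -y[0]])

    def bc(ya, yb):                  # u(0) = 0, u(pi/2) = 1
        return np.array([ya[0], yb[0] - 1.0])

    x = np.linspace(0.0, np.pi / 2.0, 11)
    y0 = np.zeros((2, x.size))       # initial mesh and guess
    sol = solve_bvp(f, bc, x, y0)
    print(sol.y[0, -1])              # ~1.0; the exact solution is u = sin(x)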

TR-80-03 Automating Physical Reorganizational Requirements at the Access Path Level of a Relational Database System, March 1980 Grant Edwin Weddell

Any design of an access path level of a database management system must make allowance for physical reorganization requirements. The facilities provided for such requirements at the access path level have so far been primitive in nature (almost always, in fact, requiring complicated human intervention). This thesis begins to explore the notion of increasing the degree of automation of such requirements at the access path level, that is, to consider the practical basis for self-adapting or self-organizing data management systems. Consideration is first given to the motivation (justification) of such a notion. Then, based on a review of the relevant aspects of a number of existing data management systems, we present a complete design specification and outline for a proposed access path level. Regarding this system we consider in detail the automation of two major aspects of physical organization: the clustering of records on mass storage media and the selection of secondary indices. The results of our analysis of these problems provide a basis for the ultimate demonstration of the feasibility of such automation.
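
As a purely hypothetical illustration (not the thesis's actual design) of what automating secondary index selection might involve, the sketch below retains an index on an attribute only when its estimated lookup savings over the observed query mix outweigh its update maintenance cost:

    def select_indices(attrs, query_freq, update_freq,
                       scan_saving, maint_cost):
        """attrs: candidate attributes; query_freq/update_freq: observed
        operation counts per attribute; scan_saving/maint_cost: estimated
        pages saved per indexed lookup and pages written per update."""
        chosen = []
        for a in attrs:
            benefit = query_freq.get(a, 0) * scan_saving
            cost = update_freq.get(a, 0) * maint_cost
            if benefit > cost:
                chosen.append(a)
        return chosen

    print(select_indices(["dept", "salary"],
                         {"dept": 900, "salary": 40},
                         {"dept": 10, "salary": 500},
                         scan_saving=50, maint_cost=5))   # -> ['dept']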

TR-80-04 On the Covering Relaxation Approach for the 0-1 Positive Polynomial Programming Problem, May 1980 Willem Vaessen

Covering relaxation algorithms were first developed by Granot et al. for solving positive 0--1 polynomial programming (PP) problems which maximize a linear objective function in 0--1 variables subject to a set of polynomial inequalities containing only positive coefficients [``Covering Relaxation for Positive 0--1 Polynomial Programs'', Management Science, Vol. 25 (1979)]. The covering relaxation approach appears to cope successfully with the non-linearity of the PP problem and is able to solve modest-size (40 variables and 40 constraints) sparse PP problems. This thesis develops a more sophisticated covering relaxation method which accelerates the performance of this approach, especially when solving PP problems with many terms in a constraint. Both the original covering relaxation algorithm and the newly introduced accelerated algorithm are cutting plane algorithms in which the relaxed problem is the set covering problem and the cutting planes are linear covering constraints. In contrast with other cutting plane methods in integer programming, the accelerated covering relaxation algorithm developed in this thesis does not solve the relaxed problem to optimality after the introduction of the cutting plane constraints. Rather, the augmented relaxed problem is first solved approximately and only if the approximate solution is feasible to the PP problem is the relaxed problem solved to optimality. The promise of this approach stems from the excellent performance of approximate procedures for solving integer programming problems. Indeed, the extensive computational experiments that were performed show that the accelerated algorithm has reduced both the number of set covering problems to be solved and the overall time required to solve a PP problem. The improvements are particularly significant for PP problems with many terms in a constraint.
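
A control-flow skeleton of the accelerated loop just described, with hypothetical callables standing in for the problem-specific pieces (the set covering solvers, the PP feasibility test, and the cut generator):

    def accelerated_covering_relaxation(solve_cover_greedy,
                                        solve_cover_exact,
                                        pp_feasible, add_cuts, relaxed):
        while True:
            x = solve_cover_greedy(relaxed)      # cheap approximate solve
            if not pp_feasible(x):
                relaxed = add_cuts(relaxed, x)   # new linear covering cuts
                continue
            x = solve_cover_exact(relaxed)       # exact solve, only now
            if pp_feasible(x):
                return x    # optimum of the relaxation, feasible for PP,
                            # hence optimal for the PP problem
            relaxed = add_cuts(relaxed, x)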

TR-80-05 Optimal Load Control in Combined Batch-Interactive Computer Systems, April 1980 Samuel T. Chanson and Prem Swarup Sinha

(Abstract not available on-line)

TR-80-06 On the Integrity of Typed First Order Data Bases, April 1980 Raymond Reiter

A typed first order data base is a set of first order formulae, each quantified variable of which is constrained to range over some type. Formally, a type is simply a distinguished monadic relation, or some Boolean combination of these. Assume that with each data base relation other than the types is associated an integrity constraint which specifies which types of individuals are permitted to fill the argument positions of that relation. The problem addressed in this paper is the detection of violations of these integrity constraints in the case of data base updates with universally quantified formulae. The basic approach is to first transform any such formula to its so-called reduced typed normal form, which is a suitably determined set of formulae whose conjunction turns out to be equivalent to the original formula. There are then simple criteria which, when applied to this normal form, determine whether that formula violates any of the argument typing integrity constraints.
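
A toy sketch of the argument-typing constraints involved, restricted to ground atoms for simplicity (the paper itself treats updates with universally quantified formulae):

    types = {                       # distinguished monadic type relations
        "Person": {"ann", "bob"},
        "Course": {"cs115", "cs215"},
    }
    constraints = {                 # relation -> required type per position
        "Teaches": ("Person", "Course"),
    }

    def violates(relation, args):
        """True if the ground atom relation(*args) breaks its constraint."""
        return any(a not in types[t]
                   for a, t in zip(args, constraints[relation]))

    print(violates("Teaches", ("ann", "cs115")))   # False: well typed
    print(violates("Teaches", ("cs115", "ann")))   # True: arguments swapped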

TR-80-07 Some Representational Issues in Default Reasoning, August 1980 Raymond Reiter and Giovanni Criscuolo

Although most commonly occurring default rules are normal when viewed in isolation, they can interact with each other in ways that lead to the derivation of anomalous default assumptions. In order to deal with such anomalies it is necessary to re-represent these rules, in some cases by introducing non-normal defaults. The need to consider such potential interactions leads to a new concept of integrity, distinct from the conventional integrity issues of first order data bases.

The non-normal default rules required to deal with default interactions all have a common pattern. Default theories conforming to this pattern are considerably more complex than normal default theories. For example, they need not have extensions, and they lack the property of semi-monotonicity.

Current semantic network representations fail to reason correctly with defaults. However, when viewed as indexing schemes on logical formulae, networks can be seen to provide computationally feasible heuristics for the consistency checks required by default reasoning.
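
For reference, in the standard notation (not reproduced from the report) a normal default concludes exactly what it requires to be consistent, while the semi-normal pattern referred to above carries an extra consistency condition in its justification:

    % Normal default: if alpha(x) is derivable and beta(x) is consistent,
    % conclude beta(x).
    \frac{\alpha(x) : \beta(x)}{\beta(x)}
    % Semi-normal default: the justification carries an extra condition
    % gamma(x), but only beta(x) is concluded.
    \frac{\alpha(x) : \beta(x) \wedge \gamma(x)}{\beta(x)}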

TR-80-08 A Spline Least Squares Method for Numerical Estimation in Differential Equations, January 1980 James M. Varah

In this paper, we describe a straightforward least squares approach to the problem of finding numerical values for parameters occurring in differential equations, so that the solution best fits some observed data. The method consists of first fitting the given data by least squares using cubic spline functions with knots chosen interactively, and then finding the parameters by least squares solution of the differential equation sampled at a set of points. We illustrate the method by three problems from chemical and biological modelling.
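
A minimal sketch of the two-stage procedure on the model y' = -k*y, with synthetic data and an automatic (rather than interactive) choice of spline fit:

    import numpy as np
    from scipy.interpolate import splrep, splev

    k_true = 1.5
    t = np.linspace(0.0, 2.0, 21)
    y = np.exp(-k_true * t) + 0.01 * np.random.default_rng(0).normal(size=t.size)

    tck = splrep(t, y, s=2e-3)        # stage 1: cubic spline fit to the data
    ts = np.linspace(0.1, 1.9, 50)    # sample points for the ODE
    s  = splev(ts, tck)               # spline values ~ y(t)
    sp = splev(ts, tck, der=1)        # spline slopes ~ y'(t)

    # Stage 2: least squares for k in s'(t) = -k * s(t):
    k_est = -np.sum(sp * s) / np.sum(s * s)
    print(k_est)                      # close to 1.5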

TR-80-09 Why Is a Goto Like a Dynamic Vector in the BCPL-Slim Computing System, November 1980 Harvey Abramson

The Slim computer is a new virtual machine which can be used in the translation and porting of the BCPL compiler and, eventually, in the porting of an operating system written in BCPL. For the purposes of this paper, the Slim computer is a stack machine with a single accumulator and a register which points to the top of the stack. The procedures LEVEL and LONGJUMP, traditionally used to implement transfers of control across BCPL procedures, and which are usually written in the assembler language of a host machine, cannot be used with this architecture. In developing procedures to implement all transfers of control, we show how these essential procedures --- though highly dependent on the Slim architecture --- can be written portably in BCPL, and discover an interesting connection between implementing jumps and dynamic vectors (by means of Aptovec) in the BCPL-Slim computing system. Some parameters of portability in mapping an abstract machine to host machines are identified, and it is shown how to maintain the portability of the above-mentioned procedures in the face of various mapping problems. Finally, we are led to a comment on the design of BCPL to the effect that goto's are an unnecessary feature of the language.
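
BCPL's LEVEL and LONGJUMP cut the dynamic call chain back to a previously captured activation, much as C's setjmp/longjmp do. A loose sketch of that behaviour, modelled with Python exceptions rather than the Slim machine's stack-pointer manipulation (all names hypothetical):

    class NonLocalJump(Exception):
        def __init__(self, target, value):
            self.target, self.value = target, value

    class Label:
        # Plays the role of the value LEVEL returns: a point in the
        # dynamic call chain that longjump() may cut back to.
        def catch(self, thunk):
            try:
                return thunk()
            except NonLocalJump as j:
                if j.target is not self:
                    raise              # belongs to an outer label
                return j.value

    def longjump(label, value):
        raise NonLocalJump(label, value)

    def inner(label):
        longjump(label, 42)            # abandons inner() and its callers
        print("never reached")

    lab = Label()
    print(lab.catch(lambda: inner(lab)))   # -> 42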

TR-80-10 Automatic Rectification of Landsat Images Using Features Derived from Digital Terrain Models, December 1980 James Joseph Little

Before two images of the same object can be compared, they must be brought into correspondence with some reference datum. This process is termed registration. The reference can be one of the images, a synthetic image, a map or other symbolic representation of the area imaged. A novel method is presented for automatically determining the transformation to align a Landsat image to a digital terrain model, a structure which represents the topography of an area. Parameters of an affine transformation are computed from the correspondence between features of terrain derived from the digital terrain model, and brightness discontinuities extracted from the Landsat image.
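
A minimal sketch (synthetic correspondences) of the estimation step: the six parameters of the affine transformation x' = a*x + b*y + c, y' = d*x + e*y + f follow by linear least squares from matched feature locations:

    import numpy as np

    src = np.array([[10., 20.], [200., 40.], [50., 180.], [120., 90.]])
    A_true = np.array([[0.98, 0.05], [-0.04, 1.02]])
    t_true = np.array([12.0, -7.0])
    dst = src @ A_true.T + t_true      # where the features land in the image

    n = src.shape[0]
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src;  M[0::2, 2] = 1.0   # rows for the x' equations
    M[1::2, 3:5] = src;  M[1::2, 5] = 1.0   # rows for the y' equations
    rhs = dst.reshape(-1)                   # interleaved x', y' targets

    params, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    print(params)   # ~ [0.98, 0.05, 12.0, -0.04, 1.02, -7.0]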

