- TR-82-01 The Representation of Presuppositions Using Defaults, March 1982 Robert Ernest Mercer and Raymond Reiter
This paper is a first step towards the computation of an inference based on language use, termed presupposition. Natural languages, unlike formal languages, can be semantically ambiguous. These ambiguities are resolved according to pragmatic rules. We take the position that presuppositions are inferences generated from these pragmatic rules. Presuppositions are then used to generate the preferred interpretation of the ambiguous natural language sentence. A preferred interpretation can be circumvented by an explicit inconsistency. This paper discusses the appropriateness of using default rules (Reiter (1980)) to represent certain common examples of presupposition in natural language. We believe that default rules are not only appropriate for representing presuppositions, but also provide a formal explanation for a precursory consistency-based presuppositional theory (Gazdar (1979)).
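The cancellation behaviour the abstract describes can be sketched in a few lines. A minimal, hypothetical illustration (the predicates and the string-based encoding are invented for exposition and are not the paper's default-logic formalism):

```python
# Hedged sketch (predicates are illustrative, not the paper's formalism):
# a presupposition is inferred by default and withdrawn when the context
# contains an explicit inconsistency.

def presuppositions(kb, defaults):
    """Return the presuppositions whose defaults fire against kb.

    kb       -- set of asserted literals; "not P" denies literal P
    defaults -- dict mapping a triggering literal to its presupposition
    """
    inferred = set()
    for trigger, presup in defaults.items():
        # The default fires only if its conclusion is consistent with kb.
        if trigger in kb and ("not " + presup) not in kb:
            inferred.add(presup)
    return inferred

# "Mary stopped smoking" presupposes "Mary smoked" ...
defaults = {"stopped(mary, smoking)": "smoked(mary)"}
print(presuppositions({"stopped(mary, smoking)"}, defaults))

# ... unless the context explicitly denies it: the preferred interpretation
# is circumvented by an explicit inconsistency.
print(presuppositions({"stopped(mary, smoking)", "not smoked(mary)"}, defaults))
```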

- TR-82-02 On Fitting Exponentials by Nonlinear Least Squares, January 1982 James M. Varah
This paper is concerned with the problem of fitting discrete data, or a continuous function, by least squares using exponential functions. We examine the questions of uniqueness and sensitivity of the best least squares solution, and provide analytic and numerical examples showing the possible non-uniqueness and extreme sensitivity of these solutions.
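As an illustration of the fitting problem, here is a sketch using scipy's generic nonlinear least-squares routine (not the paper's analysis); the model, parameter values, and starting guess are all invented for the example:

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative sketch: fit y = c1*exp(-b1*t) + c2*exp(-b2*t) to synthetic,
# noiseless data by nonlinear least squares.
t = np.linspace(0.0, 4.0, 50)
true = np.array([1.0, 1.0, 0.5, 3.0])          # c1, b1, c2, b2

def model(p, t):
    c1, b1, c2, b2 = p
    return c1 * np.exp(-b1 * t) + c2 * np.exp(-b2 * t)

y = model(true, t)

def resid(p):
    return model(p, t) - y

# A reasonable starting guess is needed: when the decay rates b1 and b2
# are close together, the problem becomes extremely ill-conditioned and
# the fitted parameters can be highly sensitive to small data changes.
fit = least_squares(resid, x0=[0.8, 0.8, 0.7, 2.5])
print(fit.x)
```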

- TR-82-03 The Complexity of Regular Expressions with Goto and Boolean Variables, March 1982 Karl Abrahamson
Regular expressions can be extended by adding gotos and Boolean variables. Although such extensions do not increase the class of expressible languages, they do permit shorter expressions for some languages. The output space complexity of eliminating Boolean variables is shown to be double exponential. The complexity of eliminating a single goto from a regular expression is shown to be $\Omega(n \log n)$, a surprising result considering that $n$ gotos can be eliminated in single exponential space.

- TR-82-04 Collocation for Singular Perturbation Problems II, May 1982 Uri Ascher and R. Weiss
We consider singularly perturbed linear boundary value problems for ODEs, with variable coefficients but without turning points. Convergence results are obtained for collocation schemes based on Gauss and Lobatto points, showing that highly accurate numerical solutions for these problems can be obtained at a very reasonable cost using such schemes, provided that appropriate meshes are used. The implementation of the numerical schemes and the practical construction of corresponding meshes are discussed.

These results extend those of a previous paper, which deals with systems with constant coefficients.

- TR-82-05 A Regression Model of a Swapping System, July 1982 Samuel T. Chanson
This paper describes a measurement experiment performed on a PDP 11/45 system running UNIX (Version 6), which employs swapping rather than paging to manage memory. Regression equations relating the system's responsiveness to certain system and workload parameters are obtained. Sample applications are presented, such as predicting the effect of workload and system changes on performance, load control, and representing the swapping behaviour in simulation and analytic models. The similarities between the paging and swapping dynamics are discussed. The paper also includes a brief discussion of the accuracy of the model, as well as the advantages and disadvantages of the regression technique.
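A hypothetical sketch of the regression approach; the variable names, coefficients, and synthetic data below are invented for illustration and are not the paper's measured model:

```python
import numpy as np

# Hedged sketch: fit response_time ~ b0 + b1*swap_rate + b2*num_procs by
# ordinary least squares on synthetic "measurements".
rng = np.random.default_rng(1)
swap_rate = rng.uniform(0, 50, 100)                   # swaps per second
num_procs = rng.integers(1, 20, 100).astype(float)    # active processes
response = 0.2 + 0.03 * swap_rate + 0.1 * num_procs + rng.normal(0, 0.05, 100)

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(swap_rate), swap_rate, num_procs])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
print(coef)          # estimated b0, b1, b2

# The fitted equation can then predict responsiveness under a proposed
# workload change, as the abstract's sample applications suggest.
print(X @ coef)      # fitted response times
```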

- TR-82-06 The Complexity of Some Polynomial Network Consistency Algorithms for Constraint Satisfaction Problems, August 1982 Alan K. Mackworth and E. C. Freuder
Constraint satisfaction problems play a central role in artificial intelligence. A class of network consistency algorithms for eliminating local inconsistencies in such problems has previously been described. In this paper we analyze the time complexity of several node, arc and path consistency algorithms. Arc consistency is achievable in time linear in the number of binary constraints. The Waltz filtering algorithm is a special case of the arc consistency algorithm. In that computational vision application the constraint graph is planar and so the complexity is linear in the number of variables.
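An arc consistency procedure of the kind analyzed here can be sketched as follows (an AC-3-style formulation; the report's exact algorithms may differ):

```python
from collections import deque

# Sketch of arc consistency: repeatedly delete values with no supporting
# value in a neighbouring variable's domain, until a fixpoint is reached.
def arc_consistency(domains, constraints):
    """domains: {var: set of values}; constraints: {(x, y): predicate(vx, vy)}.
    Each binary constraint is given as two directed arcs."""
    arcs = deque(constraints)
    while arcs:
        x, y = arcs.popleft()
        pred = constraints[(x, y)]
        # Values of x with no support in y's domain are inconsistent.
        revised = {vx for vx in domains[x]
                   if not any(pred(vx, vy) for vy in domains[y])}
        if revised:
            domains[x] -= revised
            # x's domain shrank, so arcs pointing at x must be re-examined.
            arcs.extend((z, w) for (z, w) in constraints if w == x)
    return domains

# Example: X < Y < Z over domains {1, 2, 3}.
lt = lambda a, b: a < b
gt = lambda a, b: a > b
doms = {v: {1, 2, 3} for v in "XYZ"}
cons = {("X", "Y"): lt, ("Y", "X"): gt, ("Y", "Z"): lt, ("Z", "Y"): gt}
print(arc_consistency(doms, cons))  # each domain collapses to one value
```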

- TR-82-07 Unification Based Conditional Binding Constructs, September 1982 Harvey Abramson
The unification algorithm, heretofore used primarily in the mechanization of logic, can be used in applicative programming languages as a pattern matching tool. Using SASL (St. Andrews Static Language) as a typical applicative programming language, we introduce several unification based conditional binding (i.e., pattern matching) constructs, show how they can promote clarity and conciseness of expression in applicative languages, and indicate some applications of these constructs. In particular, we present an interpreter for SASL functions defined by recursion equations.
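A minimal sketch of unification used as a pattern matching tool, in Python rather than SASL (the term representation, with tuples as compound terms and '?'-prefixed strings as variables, is an assumption made for the example):

```python
# Sketch of first-order unification as pattern matching.
# Occurs check omitted for brevity.

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution unifying a and b, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# Conditional binding: match the pattern ("cons", head, tail) against a
# list cell, binding head and tail in one step.
print(unify(("cons", "?h", "?t"), ("cons", 1, ("cons", 2, "nil"))))
```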

- TR-82-08 Performance of Some Local Area Network Technologies, August 1982 Samuel T. Chanson, A. Kumar and A. Nadkarni
This paper classifies local area network (LAN) technologies according to their topology and access method. The characteristics of the popular LAN technologies (namely Ring/Token passing, Ring/Message slots and Bus/Contention) are discussed. Analytic models are developed to estimate the mean packet delay time of each technology as a function of the network loading for various packet sizes and numbers of active stations. It is found that in the case of slotted rings (but not the other two technologies), given a packet arrival rate, there exists an optimal number of active stations which minimizes the mean delay time at all load levels. The LAN technologies are compared with regard to their performance, reliability, availability, maintainability, extensibility, fairness and complexity.

It is hoped that potential users may be able to select the appropriate technology for their intended applications based on their specific performance requirements and operating environment. As well, LAN designers may benefit from the insight provided by the analysis.
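For intuition about delay-versus-load behaviour, here is a generic queueing sketch (an M/M/1 approximation chosen for illustration; it is not one of the paper's analytic models):

```python
# Generic illustration: mean packet delay on a channel modelled as an
# M/M/1 queue.  Delay grows sharply as offered load approaches capacity.

def mean_delay(arrival_rate, packet_size_bits, bandwidth_bps):
    """Mean time in system (seconds); requires utilization < 1."""
    service_rate = bandwidth_bps / packet_size_bits   # packets per second
    rho = arrival_rate / service_rate                 # utilization
    assert rho < 1, "queue is unstable at this load"
    return 1.0 / (service_rate - arrival_rate)

# 1 Mb/s channel, 1000-bit packets (so capacity is 1000 packets/s).
for lam in (100, 500, 900):
    print(lam, mean_delay(lam, 1000, 1_000_000))
```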

- TR-82-09 Collocation for Singular Perturbation Problems III: Nonlinear Problems without Turning Points, October 1982 Uri Ascher and R. Weiss
A class of nonlinear singularly perturbed boundary value problems is considered, with restrictions which allow only well-posed problems with possible boundary layers, but no turning points. For the numerical solution of these problems, a close look is taken at a class of general purpose, symmetric finite difference schemes arising from collocation.

It is shown that if locally refined meshes, whose practical construction is discussed, are employed, then high order uniform convergence of the numerical solution is obtained. Nontrivial examples are used to demonstrate that highly accurate solutions to problems with extremely thin boundary layers can be obtained in this way at a very reasonable cost.

- TR-82-10 Standard Image Files, January 1982 William S. Havens
(Abstract not available on-line)

- TR-82-11 Multi Process Structuring of X.25 Software, October 1982 Stephen Edward Deering
Modern communication protocols present the software designer with problems of asynchrony, real-time response, high throughput, robust exception handling, and multi-level interfacing. An operating system which provides lightweight processes and inexpensive inter-process communication offers solutions to all of these problems. This thesis examines the use of the multi-process structuring facilities of one such operating system, Verex, to implement the protocols defined by CCITT Recommendation X.25. The success of the multi-process design is confirmed by a working implementation that has linked a Verex system to the Datapac public network for over a year.

The processes which make up the Verex X.25 software are organized into layers according to the layered definition of X.25. Within the layers, some processes take the form of finite-state machines which execute the state transitions specified in the protocol definition. Matching the structure of the software to the structure of the specification results in software which is easy to program, easy to understand, and likely to be correct.

Multi-process structuring can be applied with similar benefits to protocols other than X.25 and systems other than Verex.
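The finite-state-machine structuring described above can be sketched as follows (the states and events below are illustrative, not the actual X.25 state tables):

```python
# Sketch of a protocol process as a table-driven finite-state machine:
# the transition table mirrors the structure of a protocol specification.
class LinkFSM:
    TRANSITIONS = {
        ("disconnected", "connect_request"): "connecting",
        ("connecting", "connect_confirm"): "connected",
        ("connected", "data"): "connected",
        ("connected", "disconnect_request"): "disconnected",
    }

    def __init__(self):
        self.state = "disconnected"

    def handle(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            # Events invalid in the current state are rejected explicitly,
            # which gives the robust exception handling the design needs.
            raise ValueError(f"event {event!r} invalid in state {self.state!r}")
        self.state = self.TRANSITIONS[key]
        return self.state

fsm = LinkFSM()
for ev in ("connect_request", "connect_confirm", "data", "disconnect_request"):
    print(fsm.handle(ev))
```

Because the table mirrors the specification's state/event structure, checking the code against the protocol definition becomes a row-by-row comparison, which is the correctness benefit the abstract claims.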

- TR-82-12 Knowledge-Based Visual Interpretation Using Declarative Schemata, November 1982 Roger A. Browse
One of the main objectives of computer vision systems is to produce structural descriptions of the scenes depicted in images. Knowledge of the class of objects being imaged can facilitate this objective by providing models to guide interpretation, and by furnishing a basis for the structural descriptions. This document describes research into techniques for the representation and use of knowledge of object classes, carried out within the context of a computational vision system which interprets line drawings of human-like body forms.

A declarative schemata format has been devised which represents structures of image features which constitute depictions of body parts. The system encodes relations between these image constructions and an underlying three-dimensional model of the human body. Using the component hierarchy as a structural basis, two layers of representation are developed: one references the fine resolution features, and the other references the coarse resolution features. These layers are connected with links representative of the specialization/generalization hierarchy. The problem domain description is declarative, and makes no commitment to the nature of the subsequent interpretation processes. As a means of testing the adequacy of the representation, portions have been converted into a PROLOG formulation and used to "prove" body parts in a data base of assertions about image properties.

The interpretation phase relies on a cue/model approach, using an extensive cue table which is automatically generated from the problem domain description. The primary mechanisms for control of interpretation possibilities are fashioned after network consistency methods. The operation of these mechanisms is localized and separated between operations at the feature level and at the model level.

The body drawing interpretation system is consistent with aspects of human visual perception. The system is capable of intelligent selection of processing locations on the basis of the progress of interpretation. A dual resolution retina is moved about the image, collecting fine level features in a small foveal area and coarse level features in a wider peripheral area. Separate interpretations are developed locally on the basis of the two resolution levels, and the relation between these two interpretations is analyzed by the system to determine locations of potentially useful information.

- TR-82-13 A Cooperative Scheme for Image Understanding Using Multiple Sources of Information, November 1982 Jay Glicksman
One method of resolving the ambiguity inherent in interpreting images is to add different sources of information. The multiple information source paradigm emphasizes the ability to utilize knowledge gained from one source that may not be present in another. However, utilizing disparate information may create situations in which data from different sources are inconsistent.

A schemata-based system has been developed that can take advantage of multiple sources of information. Schemata are combined into a semantic network via the relations decomposition, specialization, instance-of, and neighbour. Control depends on the structure of the evolving network and a cycle of perception. Schemata cooperate by message passing so that attention can be directed where it will be most advantageous.

This system has been implemented to interpret aerial photographs of small urban scenes. Geographic features are identified using up to three information sources: the intensity image, a sketch map, and information provided by the user. The product is a robust system in which the accuracy of the results reflects the quality and amount of data provided. Images of several geographic locales are analyzed, and positive results are reported.
