Coulter, Neal S.

Person Preferred Name
Coulter, Neal S.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Content analysis is used to investigate the essence of the Software Engineering Institute's Capability Maturity Model (CMM) through its associated software process evaluation instruments. This study yields lexical maps of key terms from each questionnaire. The content analysis is conducted in three ways for each questionnaire: by question, by key process area, and by maturity level. The resulting maps are named accordingly. Super network and distribution maps are used to find relations among the maps. Key terms from the maps are compared to extract the essence of the CMM and to judge the questionnaires' ability to adequately assess an organization's process maturity.
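The abstract's lexical maps link key terms that co-occur in questionnaire items. A minimal sketch of that idea, assuming a simple bag-of-words model (the function name, sample questions, and key-term list are illustrative, not from the thesis):

```python
from collections import Counter
from itertools import combinations

def lexical_map(questions, key_terms):
    """Build a co-occurrence map: edge weight = number of questions
    in which both key terms appear together."""
    edges = Counter()
    for q in questions:
        words = set(q.lower().split())
        present = [t for t in key_terms if t in words]
        for a, b in combinations(sorted(present), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical CMM-style questionnaire items.
questions = [
    "does the project follow a documented process",
    "is the software process reviewed periodically",
    "are project commitments reviewed by management",
]
print(lexical_map(questions, ["process", "project", "reviewed"]))
```

A real content analysis system would also handle stemming, synonyms, and weighting, but the map structure — terms as nodes, co-occurrence counts as edges — is the same.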
Model
Digital Document
Publisher
Florida Atlantic University
Description
This thesis involves original research in the area of semantic analysis of textual databases (content analysis). The main intention of this study is to examine how software engineering practices can benefit from the best manufacturing practices, with a deliberate emphasis on worldwide competitive effectiveness. The ultimate goal of the U.S. Navy's Best Manufacturing Practices Program is to strengthen the U.S. industrial base and reduce the cost of defense systems by solving manufacturing problems and improving quality and reliability. When software companies adopt these practices, they can: (1) improve both software quality and staff productivity; (2) determine the current status of the organization's software process; (3) set goals for process improvement; (4) create effective plans for reaching those goals; and (5) implement the major elements of those plans.
Model
Digital Document
Publisher
Florida Atlantic University
Description
A database built from best manufacturing practices documents is explored relative to the "design" component of such practices. This content analysis study yields lexical maps of key terms. Maps are categorized by their node arrangement into four main groups: cyclic maps, sequential maps, spoke-patterned maps, and isolated maps. A comparative study of names selected by different naming algorithms helps establish the relationship between map patterns and naming algorithms. Super network and distribution maps generated from the same database are useful in defining relations among the different maps. This study outlines the overall structure of the design process through the maps' names and their interrelations.
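The four pattern groups named above have simple graph-theoretic signatures. A rough sketch of a classifier under that assumption (the degree-based rules are an illustration, not the thesis's actual categorization procedure):

```python
def classify_map(nodes, edges):
    """Classify a lexical map by node arrangement into the four
    pattern groups: cyclic, sequential, spoke-patterned, isolated."""
    if not edges:
        return "isolated"
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    degrees = sorted(degree.values())
    n, e = len(nodes), len(edges)
    # A simple cycle: every node has degree 2 and #edges == #nodes.
    if e == n and all(d == 2 for d in degrees):
        return "cyclic"
    # A chain: two endpoints of degree 1, all other nodes degree 2.
    if e == n - 1 and degrees.count(1) == 2 and all(d <= 2 for d in degrees):
        return "sequential"
    # A star: one hub connected to every other node.
    if e == n - 1 and degrees[-1] == n - 1 and all(d == 1 for d in degrees[:-1]):
        return "spoke"
    return "mixed"
```

Real lexical maps are rarely this clean, so a practical classifier would score how close a map is to each ideal pattern rather than demand an exact match.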
Model
Digital Document
Publisher
Florida Atlantic University
Description
The history of software development reflects a continuous series of problems, crises and triumphs in the development of reliable software systems. Problems with comprehension of machine language led to assemblers and high level languages, and eventually to the discipline of structured programming. Problems with program and system size led to modularity and modular design. None of these solutions proved to be final because aspirations have risen along with competence. This thesis makes the argument that the increasing size of projects, in terms of their complexity and the numbers of persons required to bring them to fruition, gives rise to a set of problems caused by the social interaction of those persons. This social context is investigated. It is argued that solutions ignoring this social context are inadequate for solving the software crisis brought on by the increasing demand for larger software systems.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Within a small, controlled environment, this study evaluates an information-based complexity metric and its theorized relationship to the effort expended in constructing a program. The metric, which calculates the amount of information present in a program specification, determines the specification's complexity measure. The observed measures of programmer effort were the numbers of keystrokes, insertions, deletions, and runs needed to complete the program specification. It was theorized that a program with a higher complexity value than another will require more programmer resources to complete. A significant relationship between the metric and the number of keystrokes was found.
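One common way to quantify "the amount of information present in a specification" is Shannon entropy over its tokens. The sketch below is a hypothetical stand-in for that idea; the thesis's actual metric formulation is not reproduced here:

```python
import math
from collections import Counter

def spec_information(spec_text):
    """Toy information content of a specification: Shannon entropy of
    the token distribution, scaled by token count (total bits).
    An illustrative stand-in, not the thesis's metric."""
    tokens = spec_text.split()
    counts = Counter(tokens)
    n = len(tokens)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return n * entropy
```

Under this toy measure a specification that repeats one term carries zero information, while one with many distinct terms scores higher, which matches the intuition that richer specifications predict more programmer effort.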
Model
Digital Document
Publisher
Florida Atlantic University
Description
This thesis evaluates DOS 3.2 file system performance in the current IBM PC AT environment, and it presents a survey of alternative file system and high-density storage integration strategies. The current file system is evaluated to determine the nature of its algorithms and structures. In particular, the file system is examined from a disk access perspective and from the perspective of alternative disk and file management strategies used in UNIX file systems.
Model
Digital Document
Publisher
Florida Atlantic University
Description
This investigation consisted of analyzing the source code of one of the IBM Series/1's operating systems, the Event Driven Executive, and one of its Application Program Products, the Multiple Terminal Monitor, for combinations of existing hardware mnemonic instructions. Such instructions could be vertically migrated into new single-mnemonic hardware instructions, thus improving system performance. Two pairs of instructions were found that seem to be strong candidates for such vertical migration. Many instructions available on the Series/1 were not used in any of the modules examined, suggesting that some hardware instructions could be eliminated.
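The core of this analysis — finding frequently adjacent instruction pairs as migration candidates, and instructions that never appear — can be sketched as a frequency count over a mnemonic listing. The mnemonics below are illustrative placeholders, not actual Series/1 opcodes:

```python
from collections import Counter

def pair_frequencies(mnemonics, instruction_set):
    """Count adjacent mnemonic pairs (vertical-migration candidates)
    and report instructions from the set that never appear."""
    pairs = Counter(zip(mnemonics, mnemonics[1:]))
    unused = sorted(set(instruction_set) - set(mnemonics))
    return pairs, unused

# Hypothetical disassembly listing and instruction set.
listing = ["MVW", "AW", "MVW", "AW", "BNE", "MVW", "AW"]
pairs, unused = pair_frequencies(listing, ["MVW", "AW", "BNE", "SW", "DW"])
print(pairs.most_common(1))  # the most frequent adjacent pair
print(unused)                # instructions never used in the listing
```

A production analysis would weight pairs by dynamic execution frequency rather than static adjacency, since a pair inside a hot loop matters far more than one in initialization code.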
Model
Digital Document
Publisher
Florida Atlantic University
Description
As the software industry continues to mature, best practices have been defined to improve the quality and productivity of software development. Adoption rates of these practices are better understood for large organizations than for small standalone organizations. Based on surveys and interviews, this study analyzes the use of software development best practices by small software organizations and the factors that influence different levels of adoption. The results demonstrate that context is the main determinant of adoption, driven by competitive pressures, organizational incentives, and prior exposure to best practices. Consulting firms were influenced by their contractual relationship with the client, which led to a greater focus on deliverables. Traditional product companies emphasized code management practices, though their focus on domain expertise detracted from software development best practices. Finally, startups were characterized by experienced senior managers who balanced quality assurance against pressure for a quick time-to-market release.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Accompanying the potential increase in power offered by parallel computers is an increase in the complexity of program design, implementation, testing and maintenance. It is important to understand the logical complexity of parallel programs in order to support the development of concurrent software. Measures are needed to quantify the components of parallel software complexity and to establish a basis for comparison and analysis of parallel algorithms at various stages of development and implementation. A set of primitive complexity measures is proposed that collectively describe the total complexity of parallel programs. The total complexity is separated into four dimensions or components: requirements, sequential, parallel and communication. Each proposed primitive measure is classified under one of these four areas. Two additional possible dimensions, fault-tolerance and real-time, are discussed. The total complexity measure is expressed as a vector of dimensions; each component is defined as a vector of primitive metrics. The method of quantifying each primitive metric is explained in detail. Those primitive metrics that contribute to the parallel and communication complexity are exercised against ten published summation algorithms and programs, illustrating that architecture has a significant effect on the complexity of parallel programs--even if the same programming language is used. The memory organization and the processor interconnection scheme had no effect on the parallel component, but did affect the communication component. Programming style and language did not have a noticeable effect on either component. The proposed metrics are quantifiable, consistent, and useful in comparing parallel algorithms. Unlike existing parallel metrics, they are general and applicable to different languages, architectures, algorithms, paradigms, programming styles and stages of software development.
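The abstract's structure — a total complexity measure as a vector of four dimensions, each itself a vector of primitive metrics — can be sketched directly as a data type. The primitive metric names used here are illustrative placeholders, not the thesis's actual metrics:

```python
from dataclasses import dataclass, field

@dataclass
class ComplexityVector:
    """Total complexity as a vector of four dimensions, each a vector
    (here a dict) of primitive metrics."""
    requirements: dict = field(default_factory=dict)
    sequential: dict = field(default_factory=dict)
    parallel: dict = field(default_factory=dict)
    communication: dict = field(default_factory=dict)

    def dimension_totals(self):
        """Collapse each dimension to a single number for comparison."""
        return {
            "requirements": sum(self.requirements.values()),
            "sequential": sum(self.sequential.values()),
            "parallel": sum(self.parallel.values()),
            "communication": sum(self.communication.values()),
        }

# Hypothetical measurements for one parallel summation program.
c = ComplexityVector(
    parallel={"process_count": 8, "synchronization_points": 12},
    communication={"messages": 24, "channels": 8},
)
print(c.dimension_totals())
```

Keeping the dimensions separate, rather than summing them into one scalar, is what lets the study observe that interconnection scheme affected the communication component but not the parallel one.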