Data compression (Computer science)

Model
Digital Document
Publisher
Florida Atlantic University
Description
In this thesis, we measure and analyze the effects of compression in a demand-paging operating system. We first survey existing compression algorithms and page replacement policies currently in use. We then examine an OS/2 operating system modified to include page-based compression. Software trace hooks are inserted into the operating system to determine the time required to process a page fault for each type of page (e.g., non-compressed, compressed, or zero-filled) and the number of page faults for each type of page. Software trace measurements, as well as physical timings, are taken on a system without compressed pages and on the same system with compressed pages. We find that the system with compressed pages shows a slight increase in paging activity on memory-constrained systems, but that performance (time) improves on both memory-constrained and unconstrained systems.
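The per-page-type measurement described above can be sketched in miniature. The snippet below is a minimal Python sketch, not the modified OS/2 implementation: `zlib` stands in for whatever page compressor the kernel used, and `evict`/`fault_in` are invented names. It keeps evicted pages compressed in memory and times their decompression when a fault brings them back in, showing why zero-filled pages are treated as a separate type:

```python
import time
import zlib

PAGE_SIZE = 4096

# Hypothetical in-memory store for evicted pages, kept compressed
# (a sketch of page-based compression, not the actual OS/2 code).
compressed_store = {}

def evict(page_id, data):
    """On eviction, compress the page instead of writing it out."""
    compressed_store[page_id] = zlib.compress(data)

def fault_in(page_id):
    """Service a page fault from the compressed store, timing it."""
    start = time.perf_counter()
    data = zlib.decompress(compressed_store.pop(page_id))
    return data, time.perf_counter() - start

# A zero-filled page compresses far better than text-like data,
# which is one reason fault cost is measured per page type.
zero_page = bytes(PAGE_SIZE)
text_page = (b"the quick brown fox jumps over the lazy dog " * 94)[:PAGE_SIZE]

evict(0, zero_page)
evict(1, text_page)
zero_size = len(compressed_store[0])  # compressed size of the zero page
text_size = len(compressed_store[1])  # compressed size of the text page

page0, t0 = fault_in(0)
page1, t1 = fault_in(1)
```

In a real kernel the timings would come from trace hooks around the fault handler rather than `time.perf_counter`, but the accounting per page type is the same idea.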
Model
Digital Document
Publisher
Florida Atlantic University
Description
A barrier to the use of digital imaging is the vast storage requirement involved. One solution is compression. Since imagery is ultimately subject to human visual perception, it is worthwhile to design and implement an algorithm that performs compression as a function of perception. The underlying premise of the thesis is that if the algorithm closely matches visual perception thresholds, then its coded images contain only the components necessary to recreate the perception of the visual stimulus. Psychophysical test results are used to map the thresholds of visual perception and to develop an algorithm that codes only the image content exceeding those thresholds. The image coding algorithm is simulated in software to demonstrate compression of a single-frame image, and the simulation results are provided. The algorithm is also adapted to real-time video compression for implementation in hardware.
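The threshold-driven coding idea can be illustrated with a toy coder. The following is a minimal Python sketch, not the thesis's algorithm: `THRESHOLD` is an assumed visibility threshold in gray levels, and the coder emits a zero whenever the change from the reconstructed value is below it, so only content exceeding the threshold is retained:

```python
# Assumed perceptual visibility threshold, in gray levels.
THRESHOLD = 4

def encode(pixels):
    """Code pixel differences; sub-threshold changes become zero."""
    out, prev = [], 0
    for p in pixels:
        diff = p - prev
        if abs(diff) < THRESHOLD:
            out.append(0)      # below threshold: treated as invisible
        else:
            out.append(diff)
        prev += out[-1]        # track the decoder's reconstruction
    return out

def decode(codes):
    """Rebuild pixels by accumulating the coded differences."""
    pixels, prev = [], 0
    for d in codes:
        prev += d
        pixels.append(prev)
    return pixels

row = [10, 11, 12, 40, 41, 42, 10, 10]
codes = encode(row)   # runs of zeros where changes are sub-threshold
recon = decode(codes)
```

Every reconstructed pixel stays within the threshold of the original, and the sub-threshold fluctuations collapse into runs of zeros that a subsequent entropy coder would compress well.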
Model
Digital Document
Publisher
Florida Atlantic University
Description
Data compression in computer-based data files has only recently come into use, with the major emphasis placed on character-string suppression. In this paper, character-string suppression, Huffman encoding, noun-vector, and dictionary-vector compression methods are reviewed and compared, along with several combinations of these methods. The methods investigated were compared against three typical data library types: 1) program source data files, 2) test case data files, and 3) text data files. Comparisons of the various methods on compression percentage, speed of compression and decompression, storage requirements, error recovery, and data security are also presented.
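Of the methods compared above, Huffman encoding is the easiest to sketch. The Python snippet below is illustrative only, not the paper's implementation: it builds a Huffman code table from symbol frequencies and compares the coded bit count with fixed-length 8-bit coding:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) for text."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak, tree). The integer tiebreak
    # keeps heapq from ever comparing two trees directly.
    heap = [(w, i, ch) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)   # two lightest subtrees...
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))  # ...merged
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"   # single-symbol edge case
    walk(heap[0][2], "")
    return codes

text = "this is an example of huffman encoding"
codes = huffman_codes(text)
bits = sum(len(codes[ch]) for ch in text)  # vs. 8 * len(text) fixed-length
```

Because the code is prefix-free, the concatenated bit strings decode unambiguously; frequent symbols get short codes, which is where the compression comes from.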
Model
Digital Document
Publisher
Florida Atlantic University
Description
This research is concerned with the algorithmic representation of technoeconomic growth in modern and next-generation telecommunications, including Internet service. The goal of this study is to support the associated forecasting efforts, and the envisioned tasks include: (i) reviewing the technoeconomic considerations prevailing in the telecommunication (telco) service industry and their implications; (ii) studying relevant aspects of the underlying complex-system evolution (akin to biological systems); (iii) pursuant co-evolution modeling of competitive business structures using dichotomous (flip-flop) states, as seen in predator evolution; (iv) conceiving a novel algorithm based on information-theoretic principles for technoeconomic forecasting, built on a modified Fisher-Kaysen model consistent with the proportional-fairness concept of consumers' willingness-to-pay; and (v) evaluating forecast needs for congestion-sensitive traffic encountered on inter-office facilities. Commensurate with the topics indicated above, the necessary algorithms, analytical derivations, and compatible models are proposed. Relevant computational exercises are performed with MATLAB[TM] using data gathered from the open literature on the service profiles of telecommunication companies (telcos), and ad hoc model verifications are performed on the results. Lastly, discussions and inferences are made, with open questions identified for further research.
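The thesis's Fisher-Kaysen-based forecasting algorithm is not reproduced here, but the general shape of technoeconomic growth forecasting can be illustrated with a generic logistic (S-curve) model. The Python sketch below is purely illustrative: the subscriber data is synthetic, the parameter values (`K`, `r`, `t0`) are assumed, and the grid search is a crude stand-in for the thesis's estimation machinery:

```python
import math

def logistic(t, K, r, t0):
    """Logistic growth curve commonly used in technoeconomic forecasting:
    saturation level K, growth rate r, inflection time t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Hypothetical subscriber counts (millions) for years 0..6 of a service,
# generated here from an assumed true curve so the fit can be checked.
years = list(range(7))
obs = [logistic(t, K=100.0, r=0.9, t0=3.0) for t in years]

# Crude grid search over (r, t0) with K fixed, minimizing squared error.
best = None
for r in [x / 10 for x in range(5, 15)]:        # r in 0.5 .. 1.4
    for t0 in [x / 2 for x in range(2, 10)]:    # t0 in 1.0 .. 4.5
        err = sum((logistic(t, 100.0, r, t0) - y) ** 2
                  for t, y in zip(years, obs))
        if best is None or err < best[0]:
            best = (err, r, t0)

_, r_hat, t0_hat = best
forecast_y10 = logistic(10, 100.0, r_hat, t0_hat)  # extrapolated demand
```

Real forecasting would estimate the saturation level as well and work from measured telco data rather than a synthetic series, but the fit-then-extrapolate structure is the same.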