Information theory

Model: Digital Document
Publisher: Florida Atlantic University
Description:
As quantum computers continue to develop, they pose a threat to cryptography, since many popular cryptosystems will be rendered vulnerable. This is because the security of most currently used asymmetric systems relies on the computational hardness of the integer factorization problem, the discrete logarithm problem, or the elliptic curve discrete logarithm problem. However, there are still some cryptosystems that resist quantum attacks. We will look at code-based cryptography in general and the McEliece cryptosystem specifically. Our goal is to understand the structure behind the McEliece scheme, including the encryption and decryption processes, and some of the advantages and disadvantages the system has to offer. In addition, using the results of the 2001 paper by Courtois, Finiasz, and Sendrier, we will discuss a digital signature scheme based on the McEliece cryptosystem. We analyze one classical algebraic attack on the security of the system, based on the distinguishing problem of deciding whether the public key of the McEliece scheme is generated from a generator matrix of a binary Goppa code or from a random binary matrix. The idea of the attack is to solve an algebraic system of equations, and we examine the dimension of the solution space of the linearized system of equations. Drawing on a 2010 paper by Faugere, Gauthier-Umana, Otmani, Perret, and Tillich, we will see the parameters needed for the intractability of the distinguishing problem.
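The abstract describes the McEliece encrypt/decrypt structure only at a high level. As a purely illustrative sketch of that structure (hide a decodable code G behind a scrambler S and a permutation P, add a correctable error on encryption, then undo P, decode, and undo S on decryption), the toy Python below substitutes the tiny [7,4] Hamming code for the binary Goppa codes the real scheme requires; all parameters and matrix names are assumptions for illustration, not secure choices.

    import numpy as np

    G = np.array([[1,0,0,0,1,1,0],
                  [0,1,0,0,1,0,1],
                  [0,0,1,0,0,1,1],
                  [0,0,0,1,1,1,1]])              # systematic generator matrix of the [7,4] Hamming code
    H = np.array([[1,1,0,1,1,0,0],
                  [1,0,1,1,0,1,0],
                  [0,1,1,1,0,0,1]])              # parity-check matrix; corrects any single bit error

    def gf2_inv(M):
        """Invert a square binary matrix over GF(2) by Gaussian elimination."""
        n = len(M)
        A = np.concatenate([M % 2, np.eye(n, dtype=int)], axis=1)
        for col in range(n):
            pivot = next(r for r in range(col, n) if A[r, col])
            A[[col, pivot]] = A[[pivot, col]]
            for r in range(n):
                if r != col and A[r, col]:
                    A[r] = (A[r] + A[col]) % 2
        return A[:, n:]

    rng = np.random.default_rng(0)

    # Private key: an invertible scrambler S, a permutation P, and the decodable code G.
    while True:
        S = rng.integers(0, 2, (4, 4))
        if round(np.linalg.det(S)) % 2 == 1:     # invertible over GF(2) iff the determinant is odd
            break
    S_inv = gf2_inv(S)
    P = np.eye(7, dtype=int)[rng.permutation(7)]

    # Public key: G_pub = S G P, intended to look like a random binary matrix.
    G_pub = S @ G @ P % 2

    def encrypt(m):
        e = np.zeros(7, dtype=int)
        e[rng.integers(7)] = 1                   # random error of weight 1 (this code's capability)
        return (m @ G_pub + e) % 2

    def decrypt(c):
        c = c @ P.T % 2                          # undo the permutation
        syndrome = H @ c % 2                     # syndrome decoding of the Hamming code
        if syndrome.any():
            err = next(j for j in range(7) if (H[:, j] == syndrome).all())
            c[err] ^= 1                          # flip the single erroneous bit
        return c[:4] @ S_inv % 2                 # first 4 bits equal m S; undo the scrambler

    m = np.array([1, 0, 1, 1])
    assert (decrypt(encrypt(m)) == m).all()

The distinguishing problem mentioned above asks whether a matrix such as G_pub can be told apart from a uniformly random binary matrix; in this toy setting it obviously can, which is why the real scheme needs large Goppa codes and carefully chosen parameters.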
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Graphs are often used to depict an abstraction of software. A graph may be an abstraction of a software system, and a subgraph may represent a software module. Coupling and cohesion are attributes that summarize the degree of interdependence or connectivity among subsystems and within subsystems, respectively. When used in conjunction with measures of other attributes, coupling and cohesion can contribute to an assessment or prediction of software quality. Information theory is attractive to us because the design decisions embodied by the graph are information. Using information theory, we propose measures of the cohesion and coupling of a modular system and of the cohesion and coupling of each constituent module. These measures conform to the properties of cohesion and coupling defined by Briand, Morasca, and Basili, applied to undirected graphs, and therefore are in the families of measures called cohesion and coupling.
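The exact information-theoretic definitions are not reproduced in the abstract. The sketch below assumes one common construction of this style, in which each node's adjacency pattern is treated as a symbol and the entropy of the pattern distribution summarizes the graph's connectivity; the module and edge names are made up, and this is only an illustration of the kind of measure involved, not the thesis's definitions.

    from collections import Counter
    from math import log2

    def pattern_entropy(nodes, edges):
        """Entropy (bits) of the distribution of node adjacency patterns in an undirected graph."""
        neighbors = {v: set() for v in nodes}
        for u, v in edges:
            neighbors[u].add(v)
            neighbors[v].add(u)
        patterns = Counter(frozenset(s) for s in neighbors.values())
        n = len(nodes)
        return -sum((c / n) * log2(c / n) for c in patterns.values())

    # A small two-module system: edges inside module A, inside module B, and one between them.
    module_a = ["a1", "a2", "a3"]
    module_b = ["b1", "b2"]
    intra_a = [("a1", "a2"), ("a2", "a3"), ("a1", "a3")]
    intra_b = [("b1", "b2")]
    inter = [("a3", "b1")]

    print("whole system:", pattern_entropy(module_a + module_b, intra_a + intra_b + inter))
    print("module A alone:", pattern_entropy(module_a, intra_a))

Restricting the computation to intra-module edges gives a cohesion-style quantity for one module, while including inter-module edges gives a system-level quantity; the thesis's actual measures are defined so as to satisfy the Briand, Morasca, and Basili properties.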
Model: Digital Document
Publisher: Florida Atlantic University
Description:
This thesis addresses a method to deduce the statistical bounds associated with the cell-transfer delay variations (CDVs) encountered by the cells of MPEG traffic transmitted in Asynchronous Transfer Mode (ATM) networks. The study focuses on: (1) estimating the CDV arising from multiplexing/switching for both constant bit rate (CBR) and variable bit rate (VBR) traffic via priority-allocation-based simulations; (2) developing an information-theoretic technique to gain insight into the combined BER-induced and multiplexing/switching-induced CDVs in ATM networks. Algorithms pertinent to CDV statistics are derived, and the lower and upper bounds of the statistics are obtained via simulations for CBR and VBR traffic. Ascertaining these bounds is useful in the cell admission control (CAC) strategies adopted in ATM transmissions. Inferential remarks indicating the effects of traffic parameters (such as bandwidth, burstiness, etc.) on the values of the statistical bounds are presented, and the scope for further work is presented.
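Neither the priority-allocation model nor the thesis's CDV statistics are given in the abstract. The toy simulation below only illustrates the general idea of obtaining empirical lower and upper delay-variation bounds for a CBR stream passing through a multiplexer with random cross traffic; the queueing model and all parameters are assumptions for illustration.

    import random
    random.seed(1)

    T = 1.0          # inter-cell spacing of the tagged CBR stream (arbitrary time units)
    SERVICE = 0.2    # cell service time at the multiplexer
    N = 10_000       # number of tagged cells simulated

    queue_free, delays = 0.0, []
    for i in range(N):
        arrival = i * T
        cross = SERVICE * random.randint(0, 2)        # 0-2 cross-traffic cells served first
        start = max(queue_free, arrival) + cross      # wait for the server, then for the cross cells
        queue_free = start + SERVICE                  # tagged cell finishes service
        delays.append(queue_free - arrival)           # total delay of the tagged cell

    delays.sort()
    cdv = [d - delays[0] for d in delays]             # variation relative to the fastest cell
    print(f"empirical 1%/99% CDV bounds: {cdv[int(0.01 * N)]:.3f} to {cdv[int(0.99 * N)]:.3f}")

Quantiles of this kind are the sense in which "lower and upper bounds" of the CDV statistics can be read off from a simulation run and then fed into admission-control decisions.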
Model: Digital Document
Publisher: Florida Atlantic University
Description:
This thesis examines the relationship between the rational expectations flexible-price macroeconomic model and the pure theory of information originally developed within the electronic communications disciplines. The thesis first develops the rational expectations macromodel, including discussions of the model's assumptions, robustness, econometric issues, and important empirical results. The pure theory of information is then developed, and the "polar cases" of full information and complete information deprivation are examined. Finally, a generalized model of information acquisition and "transmission noise" are developed within the rational expectations framework.
Model: Digital Document
Publisher: Florida Atlantic University
Description:
The objective of this research is to determine the macroscopic behavior of packet transit-times across the global Internet cloud using an artificial neural network (ANN). Specifically, the problem addressed here refers to using a "fast-convergent" ANN for the purpose indicated. The underlying principle of fast convergence is that the data presented in the training and prediction modes of the ANN are in the entropy (information-theoretic) domain, and the associated annealing process is "tuned" to adopt only the useful information content and discard the posentropy part of the data presented. To demonstrate the efficacy of the research pursued, a feedforward ANN structure is developed, and the necessary transformations required to convert the input data from the parametric domain to the entropy domain (and a corresponding inverse transformation) are applied so as to retrieve the output in the parametric domain. The fast-convergent or fast-computing ANN (FC-ANN) developed is deployed to predict packet-transit performance across the Internet. (Abstract shortened by UMI.)
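The abstract does not specify the parametric-to-entropy transformation the FC-ANN uses. As a hypothetical illustration of such a round trip, the sketch below maps inputs normalized to (0, 0.5] through the binary entropy function before they would be presented to a feedforward ANN, and numerically inverts the map to recover outputs in the parametric domain; the choice of mapping is an assumption, not the thesis's transformation.

    import numpy as np

    def to_entropy_domain(x):
        """Map values normalized to (0, 0.5] through the binary entropy function."""
        x = np.clip(x, 1e-9, 0.5)
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

    def to_parametric_domain(h, grid=np.linspace(1e-9, 0.5, 10_000)):
        """Numerically invert the entropy map on its monotone branch (0, 0.5]."""
        table = to_entropy_domain(grid)
        return grid[np.argmin(np.abs(table[:, None] - h), axis=0)]

    x = np.array([0.05, 0.20, 0.45])      # data in the parametric domain
    h = to_entropy_domain(x)              # data as an FC-ANN would see it during training/prediction
    x_back = to_parametric_domain(h)      # outputs recovered in the parametric domain
    print(h, x_back)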
Model: Digital Document
Publisher: Florida Atlantic University
Description:
The research proposed and elaborated in this dissertation is concerned with the development of new and smart techniques for subchannel allocation in asymmetric digital subscriber lines (ADSLs). ADSL refers to a class of access technology currently adopted in modern telecommunications to make use of the available channel capacity on the twisted copper wires that exist in the "last mile" between the central office and subscribers. This available spectrum on the voice-grade copper lines is judiciously used to transport broadband data over the last-mile regime. For this purpose, the channel capacity on the access lines is segmented into subchannels, and the traffic to be transported is placed on the subchannels by matching the bit rates of the traffic to the subchannel capacity (as dictated by the Hartley-Shannon law). The available subchannels for upstream and downstream are of different extents (640 kbps for upstream and 9 Mbps for downstream) and hence are qualified as asymmetric transports. Relevant to the subchannel allocation described above, the specific research carried out can be enumerated as follows: (1) development of a subchannel allocation metric (SAM) on the basis of information-theoretic considerations, duly accounting for noise/interference effects on the access lines and BER-based information impairments on the trunks feeding the access lines; (2) use of SAM as algorithmic support to train an artificial neural network (ANN), which is facilitated at the ADSL modem performing subchannel allocation. A new version of ANN training (and subchannel-allocation prediction) strategies is developed by implementing the ANN operation in the entropy plane. This technique allows a fast convergence of the ANN compatible with telecommunication transports. The incorporation of the ANN in the modem renders the subchannel allocation smart; (3) fuzzy considerations are also included in the ANN indicated above, and the operation of the ADSL modem is then tuned to function as an intelligent neuro-inference engine in its efforts toward subchannel allocation; (4) ATM support on ADSL lines is investigated, and a scheme for allocating the permanent and switched virtual circuits (supporting ATM-specified traffic) on the subchannels of the access lines is developed. Relevant call-blocking probabilities are assessed; (5) lastly, the EMI/RFI and crosstalk on access lines are studied in the framework of practical considerations, and mitigation efforts are suggested. Simulated results using data commensurate with the practical aspects of ADSL transport are furnished and discussed. Background literature is comprehensively presented chapter by chapter, and the scope for future work is identified via open questions in the concluding chapter.
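The SAM metric itself is not given in the abstract. The sketch below only illustrates the bit-rate matching step it builds on: per-subchannel capacity from the Hartley-Shannon law, followed by a naive greedy placement of traffic streams onto subchannels. The subcarrier spacing is the standard ADSL DMT value, while the SNRs and requested stream rates are made-up numbers.

    import math

    BW_HZ = 4312.5                        # ADSL DMT subcarrier spacing (Hz)
    snr_db = [35, 28, 22, 15, 9, 5]       # assumed SNR of each subchannel (dB)

    def capacity_bps(snr_db_value, bw_hz=BW_HZ):
        """Hartley-Shannon capacity C = B * log2(1 + SNR)."""
        return bw_hz * math.log2(1 + 10 ** (snr_db_value / 10))

    subchannels = [(capacity_bps(s), i) for i, s in enumerate(snr_db)]
    streams = [("video", 40_000), ("voice", 16_000), ("data", 8_000)]   # requested bit rates (bps)

    allocation, used = {}, set()
    for name, rate in sorted(streams, key=lambda s: -s[1]):
        # place each stream on the smallest still-free subchannel whose capacity covers its rate
        fit = min((sc for sc in subchannels if sc[1] not in used and sc[0] >= rate), default=None)
        if fit:
            used.add(fit[1])
        allocation[name] = fit
    print(allocation)

In the dissertation this matching is not done by a fixed greedy rule but is learned and predicted by the SAM-trained ANN, with fuzzy refinements, at the modem.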
Model: Digital Document
Publisher: Florida Atlantic University
Description:
The research proposed and elaborated in this dissertation is concerned with the development of new decision algorithms for hard-handoff strategies in mobile communication systems. Specifically, the research tasks envisaged include the following: (1) use of information-theoretic statistical distance measures as a metric for hard-handoff decisions; (2) a study to evaluate the log-likelihood criterion as the decision basis for performing the hard handoff; (3) development of a statistical model to evaluate optimum instants of measurement of the metric used for the hard-handoff decision. The aforesaid objectives refer to a practical scenario in which a mobile station (MS) traveling away from a serving base station (BS-I) may suffer communication impairment due to interference and shadowing effects, especially in an urban environment. As a result, it will seek to switch over to another base station (BS-II) that provides a stronger signal level. This is called the handoff procedure. (A hard handoff refers to the specific case in which only one base station serves the mobile at the instant of handover.) Classically, the handoff decision is made on the basis of the difference between the received signal strengths (RSS) from BS-I and BS-II. The algorithms developed here, in contrast, stipulate a decision criterion set by the statistical divergence and/or log-likelihood ratio between the received signals. The purpose of the present study is to evaluate the relative efficacy of the conventional and proposed algorithms with reference to: (i) minimization of unnecessary handoffs ("ping-pongs"); (ii) minimization of delay in handing over; (iii) ease of implementation; and (iv) minimization of possible call dropouts due to ineffective handover. Simulated results with data commensurate with practical considerations are furnished and discussed. Background literature is presented in the introductory chapter, and the scope for future work is identified via open questions in the concluding chapter.
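As an illustration of the contrast drawn above between the classical RSS-difference rule and a statistical decision criterion, the sketch below implements a hysteresis test alongside a log-likelihood-ratio test over a window of paired RSS samples. The Gaussian shadowing model, the +/-MU hypotheses, the thresholds, and the drift profile are all assumptions for illustration; they are not the dissertation's algorithms.

    import random
    random.seed(7)

    SIGMA = 6.0   # shadowing standard deviation on the RSS samples (dB), assumed
    MU = 3.0      # hypothesized mean RSS advantage of BS-II when a handoff is warranted (dB)
    HYST = 4.0    # hysteresis margin for the classical rule (dB)
    LLR_THRESHOLD = 4.0

    def classic_handoff(rss_bs1, rss_bs2):
        """Classical rule: hand off when BS-II's mean RSS exceeds BS-I's by the hysteresis margin."""
        return sum(rss_bs2) / len(rss_bs2) - sum(rss_bs1) / len(rss_bs1) > HYST

    def llr_handoff(rss_bs1, rss_bs2):
        """LLR rule: per-sample difference d ~ N(+MU, SIGMA^2) under 'hand off', N(-MU, SIGMA^2) otherwise."""
        llr = sum(2 * MU * (r2 - r1) / SIGMA ** 2 for r1, r2 in zip(rss_bs1, rss_bs2))
        return llr > LLR_THRESHOLD

    # A mobile drifting toward BS-II: BS-I's RSS decays while BS-II's grows over the window.
    window = 20
    rss_bs1 = [random.gauss(-70 - 0.3 * k, SIGMA) for k in range(window)]
    rss_bs2 = [random.gauss(-78 + 0.6 * k, SIGMA) for k in range(window)]
    print("classic:", classic_handoff(rss_bs1, rss_bs2), "llr:", llr_handoff(rss_bs1, rss_bs2))

Comparing how often each rule fires while the mobile crosses the cell boundary is the kind of experiment behind the ping-pong and handover-delay criteria listed above.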
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Development of reliable, high-quality software requires study and understanding at each step of the development process. A basic assumption in the field of software measurement is that metrics of internal software attributes somehow relate to the intrinsic difficulty of understanding a program. Measuring the information content of a program attempts to indirectly quantify the comprehension task. Information-theory-based software metrics are attractive because they quantify the amount of information in a well-defined framework. However, most information-theory-based metrics have been proposed with little reference to measurement-theory fundamentals, and empirical validation of predictive quality models has been lacking. This dissertation proves that representative information-theory-based software metrics can be "meaningful" components of software quality models in the context of measurement theory. To this end, members of a major class of metrics are shown to be regular representations of the Minimum Description Length or Variety of software attributes, and to be interval scale. An empirical validation case study is presented that predicted faults in modules based on Operator Information. This metric is closely related to Harrison's Average Information Content Classification, which is the entropy of the operators. New general methods for calculating synthetic complexity at the system level and the module level are presented, quantifying the joint information of an arbitrary set of primitive software measures. Since not all kinds of information are equally relevant to software quality factors, components of synthetic module complexity are also defined. Empirical case studies illustrate the potential usefulness of the proposed synthetic metrics. A metrics database is often the key to a successful ongoing software metrics program. The contribution of any proposed metric is defined in terms of measured variation using information theory, irrespective of the metric's usefulness in quality models. This is of interest when full validation is not practical. Case studies illustrate the method.
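A small sketch of an operator-entropy computation in the spirit of Harrison's Average Information Content Classification mentioned above: the entropy of the operator-frequency distribution in a module's token stream. The operator set and token list are made up, and the dissertation's Operator Information metric may be defined differently.

    from collections import Counter
    from math import log2

    OPERATORS = {"=", "+", "-", "*", "/", "<", ">", "==", "(", ")", "if", "while", "return"}

    def operator_entropy(tokens):
        """Entropy (bits per operator) of the operator frequency distribution."""
        counts = Counter(t for t in tokens if t in OPERATORS)
        total = sum(counts.values())
        return -sum((c / total) * log2(c / total) for c in counts.values())

    module_tokens = ["if", "(", "x", "<", "y", ")", "z", "=", "x", "+", "y",
                     "while", "(", "z", ">", "0", ")", "z", "=", "z", "-", "1",
                     "return", "z"]
    print(f"operator entropy: {operator_entropy(module_tokens):.3f} bits per operator")

A module that uses a few operators very unevenly yields low entropy, while one spreading usage across many operators yields high entropy; metrics of this form are the kind whose scale properties and predictive value the dissertation examines.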