Data processing

Model
Digital Document
Publisher
Florida Atlantic University
Description
Implementing Shamir's secret sharing scheme with floating point arithmetic would yield a faster and more efficient secret sharing scheme, owing to the speed with which GPUs perform floating point arithmetic. However, without a finite field, the properties of a perfect secret sharing scheme are not immediately attainable. The goal is to analyze whether Shamir's secret sharing scheme implemented with floating point arithmetic can achieve the properties of a perfect secret sharing scheme, and to propose improvements that attain these properties. Experiments indicate that property 2 of a perfect secret sharing scheme, "Any k-1 or fewer participants obtain no information regarding the shared secret," is compromised when Shamir's secret sharing scheme is implemented with floating point arithmetic. These experimental results also point to possible solutions and adjustments, one of which is selecting randomly generated points from a smaller interval, as in one of the schemes proposed in this thesis. Further experimental results indicate improvement using the scheme outlined. Possible attacks are run to test the desirable properties of the different schemes and reinforce the improvements observed in prior experiments.
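A minimal sketch of the floating point variant under discussion, assuming a hypothetical coefficient interval `coeff_range` (shrinking it corresponds to the "smaller interval" adjustment mentioned above); this is an illustration, not the thesis's implementation. Over the reals, unlike a finite field, share magnitudes can leak information about the secret, which is the failure of property 2 described above.

```python
import random

def make_shares_float(secret, k, n, coeff_range=(0.0, 1.0)):
    """Shamir-style sharing over floats: hide the secret as the constant
    term of a random degree-(k-1) polynomial; shares are points (x, f(x))."""
    coeffs = [secret] + [random.uniform(*coeff_range) for _ in range(k - 1)]
    def f(x):
        return sum(c * x ** i for i, c in enumerate(coeffs))
    return [(float(x), f(float(x))) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation evaluated at x = 0 recovers the secret."""
    total = 0.0
    for xi, yi in shares:
        li = 1.0
        for xj, _ in shares:
            if xj != xi:
                li *= (0.0 - xj) / (xi - xj)
        total += yi * li
    return total

shares = make_shares_float(42.0, k=3, n=5)
recovered = reconstruct(shares[:3])  # any k = 3 of the 5 shares suffice
```

Reconstruction is only approximate in floating point (rounding error replaces the exact arithmetic of a finite field), which is part of what the experiments above probe.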
Model
Digital Document
Publisher
Florida Atlantic University
Description
As computing technology continues to advance, it has become increasingly difficult to find businesses that do not rely, at least in part, upon the collection and analysis of data for the purpose of project management and process improvement. The cost of software tends to increase over time due to its complexity and the cost of employing humans to develop, maintain, and evolve it. To help control the costs, organizations often seek to improve the process by which software systems are developed and evolved. Improvements can be realized by discovering previously unknown or hidden relationships between the artifacts generated as a result of developing a software system. The objective of the work described in this thesis is to provide a visualization tool that helps managers and engineers better plan for future projects by discovering new knowledge gained by synthesizing and visualizing data mined from software repository records from previous projects.
Model
Digital Document
Publisher
Florida Atlantic University
Description
The efforts addressed in this thesis refer to assaying the extent of local features in 2D images for the purpose of recognition and classification, based on comparing a test image against a template in binary format. The bioinformatics-inspired approach pursued and the deliverables of this thesis are summarized below: 1. By applying the so-called Smith-Waterman (SW) local alignment and Needleman-Wunsch (NW) global alignment approaches of bioinformatics, a test 2D image in binary format is compared against a reference image so as to recognize the differential features that reside locally in the images being compared. 2. The SW- and NW-based binary comparison involves extending the one-dimensional sequence alignment procedure (traditionally used for molecular sequence comparison in bioinformatics) to a 2D image matrix. 3. The relevant computational algorithms are implemented as MATLAB™ code. 4. The test images considered are real-world bio-/medical images, synthetic images, microarrays, biometric fingerprints (thumb impressions), and handwritten signatures. Based on the results, conclusions are enumerated and inferences are made with directions for future studies.
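The 1D alignment procedure that the thesis extends to 2D image matrices can be sketched as follows; the scoring values (match = 1, mismatch = gap = -1) are illustrative assumptions, and rows of a binary image stand in for the molecular sequences of the original bioinformatics setting.

```python
def nw_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score via dynamic programming."""
    m, n = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i * gap          # align a[:i] against all gaps
    for j in range(1, n + 1):
        dp[0][j] = j * gap          # align b[:j] against all gaps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # substitute/match
                           dp[i - 1][j] + gap,     # gap in b
                           dp[i][j - 1] + gap)     # gap in a
    return dp[m][n]

# Compare two rows of a binary image as 1D sequences.
row_test = [1, 0, 1, 1, 0]
row_ref  = [1, 0, 1, 1, 0]
identical = nw_score(row_test, row_ref)  # 5 matches -> score 5
```

Applying such a scorer row by row (or column by column) against a reference image is one simple way the 1D procedure can be lifted to a 2D matrix; low-scoring regions flag locally differing features.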
Model
Digital Document
Publisher
Florida Atlantic University
Description
MicroRNAs (miRNAs) may serve as diagnostic and predictive biomarkers for cancer. The aim of this study was to identify novel cancer biomarkers from miRNA datasets, in addition to those already known. Three published miRNA cancer datasets (liver, breast, and brain) were evaluated, and the performance of the entire feature set was compared to the performance of individual feature filters, an ensemble of those filters, and a support vector machine (SVM) wrapper. In addition to confirming many known biomarkers, the main contribution of this study is that seven miRNAs have been newly identified by our ensemble methodology as possible important biomarkers for hepatocellular carcinoma or breast cancer, pending wet lab confirmation. These biomarkers were identified from miRNA expression datasets by combining multiple feature selection techniques (i.e., creating an ensemble) or by the SVM-wrapper, and then classified by different learners. Generally speaking, creating a subset of features by selecting only the highest ranking features (miRNAs) improved upon results generated when using all the miRNAs, and the ensemble and SVM-wrapper approaches outperformed individual feature selection methods. Finally, an algorithm to determine the number of top-ranked features to include in the creation of feature subsets was developed. This algorithm takes into account the performance improvement gained by adding additional features compared to the cost of adding those features.
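The thesis's stopping algorithm is not reproduced here; the sketch below shows one plausible rule in the same spirit, where the hypothetical threshold `min_gain` represents the cost of adding more features and `perf_by_size` maps feature-subset sizes to classifier performance.

```python
def choose_subset_size(perf_by_size, min_gain=0.005):
    """Pick the subset size after which adding more top-ranked features
    improves performance by less than `min_gain` (the cost threshold)."""
    sizes = sorted(perf_by_size)
    best = sizes[0]
    for prev, cur in zip(sizes, sizes[1:]):
        if perf_by_size[cur] - perf_by_size[prev] >= min_gain:
            best = cur          # gain justifies the extra features
        else:
            break               # diminishing returns: stop growing
    return best

# Hypothetical classifier performance (e.g. AUC) as top-ranked miRNAs
# are added in groups of five.
perf = {5: 0.80, 10: 0.85, 15: 0.87, 20: 0.872, 25: 0.873}
k = choose_subset_size(perf)    # stops once the gain falls below 0.005
```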
Model
Digital Document
Publisher
Florida Atlantic University
Description
The efforts addressed in this thesis refer to applying nonlinear risk-predictive techniques based on logistic regression to medical diagnostic test data. This study is motivated and pursued to address the following: 1. Extending the logistic regression model of biostatistics to medical informatics. 2. Computational preemptive and predictive testing to determine the probability of occurrence (p) of an event by fitting a data set to a logistic (logit function) curve, and finding upper and lower bounds on p based on stochastic considerations. 3. Using the model developed on available (clinical) data to illustrate the bounds-limited performance of the prediction. Relevant analytical methods, computational efforts, and simulated results are presented. Using the results compiled, risk evaluation in medical diagnostics is discussed with real-world examples. Conclusions are enumerated and inferences are made with directions for future studies.
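The logistic curve and bounds on p can be illustrated as follows; the coefficients `a`, `b` and standard error `se` are hypothetical stand-ins for fitted values, and the 1.96·se shift of the linear predictor is one conventional normal-approximation route to bounds, not necessarily the thesis's method.

```python
import math

def logistic(z):
    """Logistic (inverse-logit) curve: p = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def p_with_bounds(a, b, x, se):
    """Probability of the event at covariate x for a fitted model
    logit(p) = a + b*x, with approximate 95% lower/upper bounds obtained
    by shifting the linear predictor by +/- 1.96*se; because the logistic
    function is monotone, bounds on the logit map directly to bounds on p."""
    z = a + b * x
    return logistic(z - 1.96 * se), logistic(z), logistic(z + 1.96 * se)

lo, p, hi = p_with_bounds(a=-2.0, b=0.5, x=4.0, se=0.3)
# At x = 4 the linear predictor is 0, so p = 0.5 with bounds around it.
```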
Model
Digital Document
Publisher
Florida Atlantic University
Description
Wireless devices in wireless networks are typically powered by small batteries that can neither be replaced nor recharged conveniently. To prolong the operating lifetime of networks, energy efficiency is a critical issue, and energy-efficient resource allocation designs have been extensively developed. We investigated energy-efficient schemes that prolong network operating lifetime in wireless sensor networks and in wireless relay networks. Chapter 2 develops energy-efficient resource allocation that minimizes a general cost function of average user powers for small- or medium-scale wireless sensor networks, where simple time-division multiple access (TDMA) is adopted as the multiple-access scheme. A class of α-fair cost functions is derived to balance the tradeoff between efficiency and fairness in energy-efficient designs. Based on such cost functions, optimal channel-adaptive resource allocation schemes are developed for both single-hop and multi-hop TDMA sensor networks. In Chapter 3, optimal power control methods that balance the tradeoff between energy efficiency and fairness in wireless cooperative networks are developed. It is important to maximize power efficiency by minimizing power consumption for a given quality of service, such as the data rate; it is equally important to distribute power consumption evenly, or fairly, across all nodes to maximize the network lifetime. The optimal power control policy proposed is derived in quasi-closed form by solving a convex optimization problem with a properly chosen cost function. To further optimize wireless relay network performance, an orthogonal frequency-division multiplexing (OFDM) based multi-user wireless relay network is considered in Chapter 4.
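One common fair cost family from the fairness literature (a sketch, not necessarily the exact class derived in the thesis) illustrates the efficiency-fairness tradeoff: it reduces to total power when the fairness parameter is zero and increasingly penalizes the most-drained node as the parameter grows.

```python
def alpha_fair_cost(powers, alpha):
    """Fair cost of average user powers: sum of p^(1+alpha) / (1+alpha).
    alpha = 0 gives total power (pure efficiency); large alpha weights the
    largest power most heavily, pushing allocations toward min-max fairness
    and hence toward longer network lifetime."""
    return sum(p ** (1 + alpha) / (1 + alpha) for p in powers)

even = [1.0, 1.0]   # balanced power consumption across two nodes
skew = [1.9, 0.1]   # same total power, but one battery drains much faster
# alpha = 0: both allocations cost the same (total power only).
# alpha > 0: the skewed allocation costs more, so minimizing the cost
# favors the balanced allocation.
```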
Model
Digital Document
Publisher
Florida Atlantic University
Description
The aim of this work is to investigate a security model in which an adversary has access to functions of the secret key. In recent years, significant progress has been made in understanding the security of encryption schemes in the presence of key-dependent plaintexts or messages (known as KDM). Here, we motivate and explore the security of a setting where an adversary against a message authentication code (MAC) or signature scheme can access signatures on key-dependent messages. We propose a way to formalize the security of message authentication schemes in the presence of key-dependent MACs (KD-EUF) and of signature schemes in the presence of key-dependent signatures (KDS). An attack on a message recognition protocol involving a MAC is presented. It turns out that the situation is quite different from key-dependent encryption: to achieve KD-EUF security or KDS security under non-adaptive chosen-message attacks, the use of a stateful signing algorithm is inevitable, even in the random oracle model. After discussing the connection between key-dependent signing and forward security, we describe a compiler that lifts any EUF-CMA-secure one-time signature scheme to a forward-secure signature scheme offering KDS-CMA security. We then discuss how aggregate signatures can be used to combine the signatures in the certificate chain used in the compiler. A natural question arises of how to combine the KDM and KDS security definitions to obtain a secure signcryption scheme. We also draw a connection with leakage-resilient signatures, which take side-channel attacks into account. Lastly, we present some open problems for future research.
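As a toy illustration of the setting (not an attack or a scheme from the thesis), a key-dependent message is simply a message computed as a function of the authentication key itself; here HMAC-SHA256 is queried on the identity function of the key. Standard EUF-CMA security says nothing about such queries, which is precisely the gap the KD-EUF notion addresses.

```python
import hashlib
import hmac

key = b"secret-key"

# Ordinary query: the message is independent of the key.
tag_normal = hmac.new(key, b"hello", hashlib.sha256).hexdigest()

# Key-dependent query: the message is f(key); the identity function
# f(k) = k is the simplest example of a key-dependent message.
tag_kd = hmac.new(key, key, hashlib.sha256).hexdigest()
```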
Model
Digital Document
Publisher
Florida Atlantic University
Description
In order to improve the quality of care, there is an urgent need to involve patients in their own healthcare. To make the health care system patient-centric, Personal Health Records (PHRs) have been proposed as a viable solution. This research discusses the importance of a patient-centric health record system. Such systems can empower patients to participate in improving health care quality. They would also provide an economically viable solution to the need for better healthcare without escalating costs, by avoiding duplication. The proposed system is Web-based and therefore highly accessible and available. A cloud-computing-based architecture is used, which allows consumers to address the challenge of sharing medical data. A PHR would provide a complete and accurate summary of an individual's health and medical history by gathering data from many sources, making the information accessible online to anyone who has the necessary electronic credentials to view it.
Model
Digital Document
Publisher
Florida Atlantic University
Description
A shock wave, as represented by the Riemann problem, and a point-blast explosion are two key phenomena involved in a supernova explosion. Any hydrocode used to simulate supernovae should be subjected to tests consisting of the Riemann problem and the point-blast explosion. L. I. Sedov's solution of the point-blast explosion and Gary A. Sod's solution of a Riemann problem have been re-derived here from the one-dimensional fluid dynamics equations. Both problems are solved using the ideas of self-similarity and dimensional analysis. The main focus of my research was to subject the CHIMERA supernova code to these two hydrodynamic tests. The CHIMERA results for both the blast wave and the Riemann problem have then been tested by comparison with the analytic solutions.
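The dimensional-analysis core of the Sedov solution can be sketched as follows: the only combination of explosion energy E, ambient density ρ, and time t with units of length is (E t²/ρ)^(1/5), so the shock radius must scale that way. The similarity constant `xi0` ≈ 1.15 assumes an ideal gas with γ = 1.4 and is given here only for illustration.

```python
def sedov_radius(E, rho, t, xi0=1.15):
    """Shock radius of a point-blast explosion from dimensional analysis:
    R(t) = xi0 * (E * t**2 / rho)**(1/5).
    xi0 is a dimensionless similarity constant (~1.15 for gamma = 1.4)."""
    return xi0 * (E * t ** 2 / rho) ** 0.2

# Self-similarity check: scaling t by 32 must scale R by 32**0.4 = 4,
# independent of E and rho.
r1 = sedov_radius(E=1.0, rho=1.0, t=1.0)
r2 = sedov_radius(E=1.0, rho=1.0, t=32.0)
```

A hydrocode passes this test when its computed shock position tracks this power law (and the full self-similar profiles behind the shock) over time.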
Model
Digital Document
Publisher
Florida Atlantic University
Description
For years, attribution research has been dominated by the ANOVA model of behavior, which proposes that people construct their dispositional attributions of others by carefully comparing and weighing all situational information, using mental computations similar to the processes researchers use to analyze data. A preliminary experiment determined that participants were able to distinguish differences in variability assessed across persons (high vs. low consensus) and across situations (high vs. low distinctiveness). It was also clear that participants could evaluate varying levels of situational constraint. A primary experiment, administered to participants immediately following the preliminary study, determined that participants grossly under-utilized those same variables when making dispositional attributions. The results give evidence against the use of traditional ANOVA models and support the Behavior Averaging Principle of Attribution.