Algorithms

Model
Digital Document
Publisher
Florida Atlantic University
Description
Liposuction is a common invasive procedure performed for both cosmetic and non-cosmetic reasons, and its use in regenerative medicine has been increasing. Because it is invasive, it carries complications that can limit patients' recovery and quality of life. The aim of this thesis is to create an analytical framework for assessing the liposuction procedure and its outcomes. The fundamental requirement for this framework is a complete understanding of the procedure, including preparation and planning, correct performance of the procedure, and ensuring patient safety on day 0 and at weeks 2, 4, and 12 after the procedure. The liposuction outcomes of 54 patients were followed through week 12. Data collection is the first part of the framework, which involves understanding the complex surgical outcomes. Algorithms previously studied for assessing morbidity and mortality were applied in this framework to determine whether they can assess liposuction outcomes. The framework uses algorithms such as decision trees, XGBoost, random forest, support vector classifiers, k-nearest neighbors, and k-means, evaluated with k-fold cross-validation. XGBoost performed best at assessing liposuction outcomes without validation; however, after cross-validation, random forest, support vector machine, and the KNN classifier outperformed XGBoost. This framework makes it possible to assess liposuction outcomes based on the performance of these algorithms. In the future, researchers can use this framework to assess liposuction as well as other surgical outcomes.
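As an illustration of the k-fold cross-validation step mentioned above, here is a minimal pure-Python sketch on synthetic data; the single feature, the labels, and the 1-nearest-neighbor classifier are stand-ins for illustration only, not the thesis's actual models or patient data:

```python
import random

def k_fold_indices(n, k):
    """Shuffle range(n) once, then deal indices into k folds."""
    idx = list(range(n))
    random.Random(0).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def one_nn_predict(train, test_x):
    """1-nearest-neighbor on a single numeric feature."""
    return min(train, key=lambda p: abs(p[0] - test_x))[1]

def cross_validate(data, k=5):
    """Hold out each fold in turn, train on the rest, average accuracy."""
    folds = k_fold_indices(len(data), k)
    accs = []
    for i in range(k):
        test = [data[j] for j in folds[i]]
        train = [data[j] for f in folds[:i] + folds[i + 1:] for j in f]
        correct = sum(one_nn_predict(train, x) == y for x, y in test)
        accs.append(correct / len(test))
    return sum(accs) / k

# synthetic "outcome" data: feature below 5.0 -> class 0, otherwise class 1
data = [(x / 10, int(x >= 50)) for x in range(100)]
print(cross_validate(data))  # mean held-out accuracy over 5 folds
```

The same loop structure applies when the 1-NN stand-in is replaced by any of the classifiers named above; only held-out folds are ever scored, which is why cross-validated rankings can differ from unvalidated ones.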
Model
Digital Document
Publisher
Florida Atlantic University
Description
Pond aquaculture accounts for 65% of global finfish production. A major factor limiting pond aquaculture productivity is fluctuating dissolved oxygen (DO) levels, which are heavily influenced by atmospheric conditions and primary productivity. Being able to predict DO concentrations from measured environmental parameters would help improve the industry's efficiency. The data collected included pond DO, water temperature, air temperature, atmospheric pressure, wind speed and direction, solar irradiance, rainfall, and pond Chl-a concentrations, as well as water color images. Pearson's correlations and stepwise regressions were used to determine the variables' connection to DO and their potential usefulness for a prediction model. It was determined that sunlight levels play a crucial role in DO fluctuations and crashes because of their influence on pond heating, primary productivity, and pond stratification. It was also found that the image data correlated with certain weather variables and helped improve prediction strength.
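Pearson's correlation, used above to screen predictor variables, can be computed directly; the irradiance and DO readings below are hypothetical values chosen only to illustrate the calculation:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical daytime readings: DO rising and falling with solar irradiance
irradiance = [0, 50, 200, 450, 700, 850, 900, 600, 300, 100]   # W/m^2
do_mg_l    = [4.1, 4.3, 5.0, 6.2, 7.5, 8.1, 8.4, 7.0, 5.8, 4.9]  # mg/L
print(round(pearson_r(irradiance, do_mg_l), 3))  # close to +1
```

A value near +1 or -1 flags a variable worth carrying into the stepwise regression; values near 0 suggest little linear predictive value for DO.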
Model
Digital Document
Publisher
Florida Atlantic University
Description
The application layer is becoming an increasingly attractive target for hackers attacking computer networks. From complex rootkits to Denial of Service (DoS) attacks, hackers look to compromise computer networks. Web and application servers can be shut down by various application-layer DoS attacks, which exhaust CPU or memory resources. The HTTP protocol has become a popular target for launching application-layer DoS attacks. These exploits consume less bandwidth than traditional DoS attacks. Furthermore, this type of DoS attack is hard to detect because its network traffic resembles legitimate network requests. Being able to detect these DoS attacks effectively is a critical component of any robust cybersecurity system. Machine learning can help detect DoS attacks by identifying patterns in network traffic. With machine learning methods, predictive models can automatically detect network threats.
This dissertation offers a novel framework for collecting several attack datasets on a live production network, where producing quality representative data is a requirement. Our approach builds datasets from collected Netflow and Full Packet Capture (FPC) data. We evaluate a wide range of machine learning classifiers, which allows us to analyze slow DoS detection models more thoroughly. To identify attacks, we look at each dataset's unique traffic patterns and distinguishing properties. This research evaluates and investigates appropriate feature selection evaluators and search strategies. Features are assessed for their predictive value and degree of redundancy to build a subset of features; feature subsets with high class correlation but low intercorrelation are favored. Experimental results indicate Netflow and FPC features are discriminating enough to detect DoS attacks accurately. We conduct a comparative examination of performance metrics to determine the capability of several machine learning classifiers. Additionally, we improve upon our performance scores by investigating a variety of feature selection optimization strategies. Overall, this dissertation proposes a novel machine learning approach for detecting slow DoS attacks. Our machine learning results demonstrate that a single subset of features trained on Netflow data can effectively detect slow application-layer DoS attacks.
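The subset evaluation described above, favoring features that correlate with the class but not with each other, matches the merit heuristic used in correlation-based feature selection (CFS); here is a minimal sketch with hypothetical correlation values, not the dissertation's actual Netflow features:

```python
import math
from itertools import combinations

def cfs_merit(class_corrs, feat_corr):
    """CFS merit of a feature subset: rises with the average
    feature-class correlation, falls with the average
    feature-feature intercorrelation (redundancy)."""
    k = len(class_corrs)
    r_cf = sum(class_corrs) / k
    pairs = list(combinations(range(k), 2))
    r_ff = sum(feat_corr[i][j] for i, j in pairs) / len(pairs) if pairs else 0.0
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

# two hypothetical 2-feature subsets: A is individually stronger but redundant
merit_a = cfs_merit([0.8, 0.8], [[1.0, 0.9], [0.9, 1.0]])  # high intercorrelation
merit_b = cfs_merit([0.7, 0.7], [[1.0, 0.1], [0.1, 1.0]])  # low intercorrelation
print(merit_a < merit_b)  # True: the less redundant subset scores higher
```

A search strategy (greedy forward selection, best-first, and so on) then explores subsets guided by this score rather than by individual feature rankings alone.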
Model
Digital Document
Publisher
Florida Atlantic University
Description
The Internet has provided humanity with many great benefits, but it has also introduced new risks and dangers. E-commerce and other web portals have become large industries with big data. Criminals and other bad actors constantly seek to exploit these web properties through web attacks. Being able to properly detect these web attacks is a crucial component in the overall cybersecurity landscape. Machine learning is one tool that can assist in detecting web attacks. However, properly using machine learning to detect web attacks does not come without its challenges. Classification algorithms can have difficulty with severe levels of class imbalance. Class imbalance occurs when one class label disproportionately outnumbers another class label. For example, in cybersecurity, it is common for the negative (normal) label to severely outnumber the positive (attack) label. Another difficulty encountered in machine learning is that models can be complex, making it difficult for even subject matter experts to truly understand a model’s detection process. Moreover, it is important for practitioners to determine which input features to include or exclude in their models for optimal detection performance. This dissertation studies machine learning algorithms in detecting web attacks with big data. Severe class imbalance is a common problem in cybersecurity, and mainstream machine learning research does not sufficiently consider this with web attacks. Our research first investigates the problems associated with severe class imbalance and rarity. Rarity is an extreme form of class imbalance in which the positive class has an extremely low instance count, making it difficult for classifiers to discriminate. In reducing imbalance, we demonstrate random undersampling can effectively mitigate the class imbalance and rarity problems associated with web attacks.
Furthermore, our research introduces a novel feature popularity technique which produces easier-to-understand models by including only the fewest, most popular features. Feature popularity gave us new insights into the web attack detection process, even though we had already studied it intensively. Even so, we proceed cautiously in selecting the best input features, as we determined that the seemingly most important Destination Port feature might be contaminated by lopsided traffic distributions.
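Random undersampling, the imbalance mitigation named above, can be sketched in a few lines; the record format and the 1000:10 ratio here are illustrative, not the dissertation's data:

```python
import random

def random_undersample(records, label_key="label", seed=0):
    """Randomly discard majority-class records until the two
    classes are balanced; minority records are all kept."""
    pos = [r for r in records if r[label_key] == 1]  # attack (minority)
    neg = [r for r in records if r[label_key] == 0]  # normal (majority)
    rng = random.Random(seed)
    neg = rng.sample(neg, len(pos))                  # keep as many negatives as positives
    balanced = pos + neg
    rng.shuffle(balanced)
    return balanced

# hypothetical 1000:10 imbalance between normal and attack traffic
data = [{"label": 0}] * 1000 + [{"label": 1}] * 10
balanced = random_undersample(data)
print(len(balanced))  # 20 records, 10 per class
```

Note that undersampling is applied to the training split only; evaluating on an artificially balanced test set would overstate real-world performance.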
Model
Digital Document
Publisher
Florida Atlantic University
Description
One of the most common types of cancer among women is breast cancer, and it is among the diseases leading to a high number of mortalities among women. Prostate cancer, on the other hand, is the second most frequent malignancy in men worldwide.
The early detection of prostate cancer is fundamental to reducing mortality and increasing the survival rate. A comparison of six machine learning models (Logistic Regression, Decision Tree, Random Forest, Gradient Boosting, k-Nearest Neighbors, and Naïve Bayes) has been performed. This research aims to identify the most efficient machine learning algorithms for identifying the most significant risk factors for prostate and breast cancers. To this end, the National Health Interview Survey (NHIS) and the Prostate, Lung, Colorectal, and Ovarian (PLCO) datasets are used. A comprehensive comparison of the risk factors leading to these two crucial cancers can significantly impact early detection and progressively improve survival.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Unmanned Aircraft Systems (UAS) have grown in popularity due to their widespread potential applications, including efficient package delivery, monitoring, surveillance, search and rescue operations, and agricultural uses, along with many others. As UAS become more integrated into our society and airspace, it is anticipated that the development and maintenance of a collision-free path planning system will become imperative, as the safety and efficiency of the airspace represent a priority. The dissertation defines this problem as the UAS Collision-free Path Planning Problem.
The overall objective of the dissertation is to design an on-demand, efficient, and scalable aerial-highway path planning system for UAS. The dissertation explores two solutions to this problem. The first solution proposes a space-time algorithm that searches for shortest paths in a space-time graph. The solution maps the aerial traffic map to a space-time graph discretized at the inter-vehicle safety distance, which yields safe trajectories by design. The mechanism uses space-time edge pruning to maintain the dynamic availability of edges as vehicles move along their trajectories. Pruning edges is critical to protect active UAS from collisions and safety hazards. The dissertation compares the solution with related work to evaluate improvements in delay, run-time scalability, and admission success in scenarios with up to 9,000 flight requests in the network. The second solution to the path planning problem uses a batch planning algorithm, a new mechanism that processes a batch of flight requests prioritized by their current slack time. This approach aims to improve the planning success ratio. The batch planning algorithm is compared with the space-time algorithm to ascertain improvements in admission ratio, delay ratio, and running time, in scenarios with up to 10,000 flight requests.
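The space-time search can be illustrated with a toy version: a breadth-first search over (node, time) states in which edges occupied by already-admitted vehicles are pruned. The three-node airspace and the occupancy set below are invented for illustration and are far simpler than the dissertation's discretized traffic map:

```python
from collections import deque

def plan_path(adj, start, goal, occupied, max_t=50):
    """BFS over (node, time) states. A move into a cell at time t+1 is
    pruned if an already-admitted vehicle occupies that cell then."""
    frontier = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while frontier:
        node, t, path = frontier.popleft()
        if node == goal:
            return path
        if t >= max_t:
            continue
        for nxt in adj[node] + [node]:      # move to a neighbor, or wait in place
            if (nxt, t + 1) in occupied:    # safety-distance cell already taken
                continue
            if (nxt, t + 1) not in seen:
                seen.add((nxt, t + 1))
                frontier.append((nxt, t + 1, path + [nxt]))
    return None  # no collision-free path within the horizon

# toy airspace: A - B - C, with another UAS sitting at B at time 1
adj = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
occupied = {("B", 1)}
print(plan_path(adj, "A", "C", occupied))  # ['A', 'A', 'B', 'C']: waits one step
```

Admitting a request would then add its trajectory's (node, time) cells to `occupied`, which is the pruning that keeps later plans collision-free by construction.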
Model
Digital Document
Publisher
Florida Atlantic University
Description
An adversary armed with a quantum computer has algorithms [66, 33, 34] at their disposal that are capable of breaking our current methods of encryption. Even with the birth of post-quantum cryptography [52, 62, 61], some of the best cryptanalytic algorithms are still quantum [45, 8]. This thesis contains several experiments on the efficacy of the lattice reduction algorithms BKZ and LLL. In particular, the difficulty of solving Learning With Errors is assessed by reducing the problem to an instance of the Unique Shortest Vector Problem. The results are used to predict the behavior these algorithms may exhibit on actual cryptographic schemes whose security is based on hard lattice problems. Lattice reduction algorithms require several floating-point operations, including multiplication. In this thesis, I also consider the resource requirements of a quantum circuit designed to simulate floating-point multiplication with high precision.
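For intuition about what lattice reduction computes, here is Lagrange (Gauss) reduction, the two-dimensional special case that LLL generalizes: it turns a skewed basis into one whose vectors are as short as possible. The input basis below is a toy example, unrelated to the thesis's experiments:

```python
def lagrange_reduce(u, v):
    """Lagrange/Gauss reduction of a 2-D integer lattice basis.
    Repeatedly subtract the best integer multiple of the shorter
    vector from the longer one, like a vector-valued Euclid's
    algorithm, until no multiple shortens it further."""
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    while True:
        if dot(u, u) > dot(v, v):   # keep u the shorter vector
            u, v = v, u
        m = round(dot(u, v) / dot(u, u))
        if m == 0:                  # v cannot be shortened against u
            return u, v
        v = (v[0] - m * u[0], v[1] - m * u[1])

# toy skewed basis of the integer lattice Z^2
print(lagrange_reduce((1, 1), (3, 2)))  # ((1, 0), (0, 1))
```

In two dimensions this provably finds a shortest basis; LLL and BKZ trade that guarantee for tractability in the high dimensions that lattice-based cryptography uses.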
Model
Digital Document
Publisher
Florida Atlantic University
Description
Numerous examples arise in fields ranging from mechanics to biology where the disappearance of chaos can be detrimental. Preventing such transient chaos has proven to be quite challenging. This work shows the utility of Reinforcement Learning (RL), a specific class of machine learning techniques, in discovering effective control mechanisms in this regard. The autonomous control algorithm is able to prevent the disappearance of chaos in the Lorenz system exhibiting meta-stable chaos, without requiring any a priori knowledge of the underlying dynamics. The autonomous decisions taken by the RL algorithm are analyzed to understand how the system’s dynamics are impacted. Learning from this analysis, a simple control law capable of restoring chaotic behavior is formulated. The reverse-engineering approach adopted in this work underlines the immense potential of the techniques used here for discovering effective control strategies in complex dynamical systems. The autonomous nature of the learning algorithm makes it applicable to a diverse variety of non-linear systems, and highlights the potential of RL-enabled control for regulating other transient-chaos-like catastrophic events.
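For context, the uncontrolled Lorenz dynamics can be integrated with a simple forward-Euler step as below; the step size, parameters, and initial state are arbitrary illustrative choices, and the RL controller itself (which would perturb the state between steps) is not shown:

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations:
    dx/dt = sigma(y - x), dy/dt = x(rho - z) - y, dz/dt = xy - beta*z."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

# integrate the free (uncontrolled) dynamics; a control policy would
# apply small perturbations to the state between these steps
state = (1.0, 1.0, 1.0)
for _ in range(1000):
    state = lorenz_step(state)
print(state)  # trajectory remains on the bounded attractor region
```

An RL agent in this setting observes the state, applies a small perturbation each step, and is rewarded for keeping the trajectory chaotic rather than letting it collapse.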
Model
Digital Document
Publisher
Florida Atlantic University
Description
Automatic target recognition of unexploded ordnance in side-scan sonar imagery has been a challenging task due to the lack of publicly available side-scan sonar data. Real-time image detection and classification algorithms have been implemented to address this task; however, machine learning algorithms require a substantial amount of training data to properly detect specific targets. Transfer learning methods reduce the need for large datasets by fine-tuning a pre-trained network on the side-scan sonar images. In the present study, a generative adversarial network is implemented to generate meaningful sonar imagery from a small dataset. The generated images are then added to the existing dataset to train an image detection and classification algorithm. The study looks to demonstrate that generated images can be used to aid in detecting objects of interest in side-scan sonar imagery.
Model
Digital Document
Publisher
Florida Atlantic University
Description
The integrity of network communications is constantly being challenged by more sophisticated intrusion techniques. Attackers are shifting to stealthier and more complex forms of attacks in an attempt to bypass known mitigation strategies. Also, many detection methods for popular network attacks have been developed using outdated or non-representative attack data. To effectively develop modern detection methodologies, there exists a need to acquire data that can fully encompass the behaviors of persistent and emerging threats. When collecting modern-day network traffic for intrusion detection, substantial amounts of traffic can be gathered, within which attack instances are relatively few compared to normal traffic. This skewed distribution between normal and attack data can lead to high levels of class imbalance. Machine learning techniques can be used to aid in attack detection, but large levels of imbalance between normal (majority) and attack (minority) instances can lead to inaccurate detection results.