Department of Mathematical Sciences

Related Entities
Model
Digital Document
Publisher
Florida Atlantic University
Description
A common topological data analysis approach used in the experimental sciences involves creating machine learning pipelines that incorporate discriminating topological features derived from persistent homology (PH) of data samples, encoded in persistence diagrams (PDs) and associated topological feature vectors. Often the most computationally demanding step is computing PH through an algorithmic process known as boundary matrix reduction. In this work, we introduce several methods to generate topological feature vectors from unreduced boundary matrices. We compare the performance of classifiers trained on vectorizations of unreduced PDs to those trained on vectorizations of fully-reduced PDs across several benchmark ML datasets. We find that models trained on vectorizations of unreduced PDs can perform on par with, and even outperform, those trained on fully-reduced PDs. This observation suggests that machine learning pipelines that incorporate topology-based features may benefit in both computational cost and performance by utilizing information contained in unreduced boundary matrices.
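For orientation, the following is a minimal sketch of the standard persistence column-reduction algorithm over GF(2) that the abstract refers to as boundary matrix reduction; it is illustrative only, not the authors' implementation, and the toy filtration at the end is an assumption for the example.

```python
def reduce_boundary_matrix(columns):
    """Standard persistence reduction over GF(2).

    Each column is a set of row indices (its nonzero entries). Earlier
    columns are repeatedly added (symmetric difference) to clear a
    shared lowest one, the step that dominates PH computation."""
    low_to_col = {}                       # pivot row -> index of the column owning it
    for j in range(len(columns)):
        while columns[j] and max(columns[j]) in low_to_col:
            columns[j] ^= columns[low_to_col[max(columns[j])]]
        if columns[j]:
            low_to_col[max(columns[j])] = j
    return columns

# toy filtration of a filled triangle: simplices 0-2 are vertices,
# 3-5 are edges, 6 is the 2-cell; column j lists the rows of its boundary
cols = [set(), set(), set(), {0, 1}, {1, 2}, {0, 2}, {3, 4, 5}]
reduced = reduce_boundary_matrix(cols)
# the (low(j), j) pairs of the nonzero reduced columns encode the PD
print([(max(c), j) for j, c in enumerate(reduced) if c])
```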
Model
Digital Document
Publisher
Florida Atlantic University
Description
The security of current public-key cryptographic schemes, based on the integer factorization and discrete logarithm problems, is expected to be completely broken by quantum computers running Shor's algorithm. As a result, the National Institute of Standards and Technology (NIST) initiated the Post-Quantum Cryptography (PQC) standardization process in 2016, inviting researchers to submit candidate algorithms that are both resistant to quantum attacks and efficient for real-world applications. Researchers have since studied various aspects of the candidate algorithms, such as their security against quantum attacks and their efficient implementation on different platforms.
In this thesis, we investigate the practical aspects of Post-Quantum Cryptography and contribute to several topics. First, we focus on the knapsack problem and its security under classical and quantum attacks. Second, we improve the secure biometric template generation algorithm NTT-Sec, proposing an enhanced version, NTT-Sec-R, and providing an in-depth design and security analysis. Third, we work on optimizing implementations of the post-quantum secure signature scheme LESS and polynomial inversion algorithms for code-based schemes. Finally, we analyze a proposed countermeasure for the exposure model of SIKE, the isogeny-based scheme that is a candidate in NIST’s Round 4.
Model
Digital Document
Publisher
Florida Atlantic University
Description
We explore a novel method of approximating contractible invariant circles in maps. The process begins by leveraging improvements on Birkhoff's Ergodic Theorem via Weighted Birkhoff Averages to compute high-precision estimates of several Fourier modes. We then set up a Newton-like iteration scheme to further improve the estimate and extend the approximation out to a sufficient number of modes to yield a significant decay in the magnitude of the high-order coefficients. With this approximation in hand, we explore the phase space near the approximate invariant circle with a form of numerical continuation in which the rotation number is perturbed and the process is repeated. We then turn our attention to a different problem that can be approached in a similar way: finding the boundary of a Siegel disk of a holomorphic map. Given a holomorphic map that leads to a formally solvable cohomological equation near the origin, we use a numerical-continuation-style process to approximate an invariant circle in the Siegel disk near the origin. Using an iterative scheme, we then enlarge the invariant circle so that it approximates the boundary of the Siegel disk.
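To illustrate the first step, here is a minimal sketch of a weighted Birkhoff average using the exponential bump weight common in this literature; the weight, map, and observable below are assumptions for a toy example, not details taken from the dissertation.

```python
import numpy as np

def weighted_birkhoff_average(observable_values):
    """Weighted Birkhoff average with the bump weight w(t) = exp(-1/(t(1-t))).

    For smooth observables along sufficiently irrational rotations, this
    converges to the space average far faster than the plain Birkhoff sum."""
    N = len(observable_values)
    t = (np.arange(N) + 0.5) / N          # stay off the endpoints, where w = 0
    w = np.exp(-1.0 / (t * (1.0 - t)))
    return np.dot(w, observable_values) / np.sum(w)

# toy orbit: rigid rotation x -> x + omega (mod 1) with the golden mean
omega = (np.sqrt(5.0) - 1.0) / 2.0
x = np.mod(omega * np.arange(2000), 1.0)
# first Fourier mode e^{2 pi i x}; its true average over the circle is 0
print(weighted_birkhoff_average(np.exp(2j * np.pi * x)))
```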
Model
Digital Document
Publisher
Florida Atlantic University
Description
An efficient scalar multiplication algorithm is vital for elliptic curve cryptosystems. The first part of this dissertation focuses on a scalar multiplication algorithm based on scalar recodings resistant to timing attacks. The algorithm utilizes two recoding methods: Recode, which generalizes the non-zero signed all-bit set recoding, and Align, which generalizes the sign-aligned columns recoding. For an ℓ-bit scalar split into d subscalars, our algorithm has a computational cost of ⌈⌈ℓ log_k(2)⌉/d⌉ point additions and k-scalar multiplications, and a storage cost of k^(d−1)(k − 1) − 1 points on E. The “split and comb” method further optimizes computational and storage complexity. We find the best setting to be a fixed base point on a Twisted Edwards curve using a mix of projective and extended coordinates, with k = 2 generally offering the best performance, although k = 3 may be better in certain applications. The second part of this dissertation is dedicated to constant-time polynomial inversion algorithms in Post-Quantum Cryptography (PQC). The computation of the inverse of a polynomial over a quotient ring or finite field is crucial for key generation in post-quantum cryptosystems such as NTRU, BIKE, and LEDACrypt, and efficient algorithms must run in constant time to prevent side-channel attacks. We examine constant-time algorithms based on Fermat’s Little Theorem and the Extended GCD Algorithm, providing detailed time complexity analysis. We find that the constant-time Extended GCD inversion algorithm is more efficient, performing fewer field multiplications. Additionally, we explore other exponentiation algorithms similar to the Itoh-Tsujii inversion method, which optimizes polynomial multiplications in the BIKE/LEDACrypt setup. Recent results on hardware implementations are also discussed.
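To make the Fermat-based approach concrete, here is a minimal prime-field sketch; the dissertation works in polynomial quotient rings, so a prime field with toy parameters is a stand-in of my choosing, not the scheme's actual setting. The key point carries over: the exponent p − 2 is public, so branching on its bits reveals nothing about the secret input.

```python
def flt_inverse(a, p):
    """Invert a in GF(p) via Fermat's Little Theorem: a^(p-2) = a^(-1) (mod p).

    Square-and-multiply with a fixed, public exponent: the branch pattern
    depends only on p, so the secret value a never influences control flow."""
    result, base, exp = 1, a % p, p - 2
    while exp:
        if exp & 1:                       # exponent bits are public
            result = (result * base) % p
        base = (base * base) % p
        exp >>= 1
    return result

p = 2**13 - 1                             # a toy Mersenne prime
a = 12345
assert (flt_inverse(a, p) * a) % p == 1
```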
Model
Digital Document
Publisher
Florida Atlantic University
Description
Computational tools grounded in algebraic topology, known collectively as topological data analysis (TDA), have been used for dimensionality reduction that preserves salient and discriminating features in data. This faithful but compressed representation of data through TDA’s flagship method, persistent homology (PH), motivates its use to address the complexity, depth, and inefficiency issues present in privacy-preserving, homomorphic encryption (HE)-based machine learning (ML) models, which permit a data provider (often referred to as the Client) to outsource computational tasks on their encrypted data to a computationally superior but semi-honest party (the Server). This work introduces efforts to adapt the well-established TDA-ML pipeline to encrypted data, realizing the benefits TDA can offer against HE’s computational limitations while providing HE’s provable security on the sensitive data domains in which TDA has found success (e.g., sequence, gene expression, and imaging data). The privacy-protecting technologies that could emerge from this foundational work will lead to direct improvements in the accessibility and equitability of health care systems. ML promises to reduce biases and improve the accuracy of diagnoses, and enabling such models to act on sensitive biomedical data without exposing it will improve the trustworthiness of these systems.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Euclidean lattices have attracted considerable research interest as they can be used to construct efficient cryptographic schemes that are believed to be quantum-resistant. The NTRU problem, introduced by J. Hoffstein, J. Pipher, and J. H. Silverman in 1996 [16], serves as an important average-case computational problem in lattice-based cryptography. Following their pioneering work, the NTRU assumption and its variants have been used widely in modern cryptographic constructions such as encryption and signature schemes.
Let R_q = Z_q[x]/(x^n + 1) be a quotient polynomial ring. The standard NTRU problem asks to recover short polynomials f, g ∈ R_q such that h ≡ g/f (mod q), given a public key h and the promise that such elements exist. In practice, the degree n is often a power of two. As a generalization of NTRU, the Module-NTRU problems were introduced by Cheon, Kim, Kim, and Son (IACR ePrint 2019/1468), and Chuengsatiansup, Prest, Stehlé, Wallet, and Xagawa (ASIACCS '20).
In this thesis, we present two post-quantum digital signature schemes based on the Module-NTRU problem and its variants.
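To make the NTRU relation concrete, the following toy sketch builds an NTRU-style public key in a tiny ring; n = 4 and q = 7 are illustrative assumptions (real parameters are far larger), and the brute-force inversion is only feasible at this toy scale.

```python
import itertools

q, n = 7, 4                                # toy parameters; real NTRU uses large q, n

def mul(a, b):
    """Multiply coefficient lists (low degree first) in Z_q[x]/(x^n + 1)."""
    c = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < n:
                c[i + j] = (c[i + j] + ai * bj) % q
            else:
                c[i + j - n] = (c[i + j - n] - ai * bj) % q   # x^n = -1
    return c

f = [1, 1, 0, 0]                           # short secret: 1 + x
g = [1, 0, 6, 0]                           # short secret: 1 - x^2 (mod 7)
one = [1] + [0] * (n - 1)

# brute-force f^{-1} in R_q; only possible because q^n is tiny here
f_inv = next(list(c) for c in itertools.product(range(q), repeat=n)
             if mul(f, list(c)) == one)

h = mul(g, f_inv)                          # public key h = g / f in R_q
assert mul(h, f) == g                      # the NTRU relation h * f = g (mod q)
print("h =", h)                            # here h = 1 - x, i.e. [1, 6, 0, 0]
```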
Model
Digital Document
Publisher
Florida Atlantic University
Description
The following dissertation investigates algebraic frames and their spaces of minimal prime elements with respect to the Hull-Kernel topology and the Inverse topology. Much work has been done by other authors to obtain internal characterizations, in frame-theoretic terms, of when these spaces satisfy certain topological properties, but most of it is carried out under the assumption of the finite intersection property (FIP). In the first half of this dissertation, we add to the literature more characterizations in this context, and in the second half we study general algebraic frames and investigate which, if any, of the known theorems generalize to algebraic frames that do not necessarily have the FIP.
Throughout this investigation, we have found that certain ideals and filters of algebraic frames play a pivotal role in internal characterizations of when interesting topological properties occur in the space of minimal prime elements. In this dissertation, we investigate completely prime filters and compactly generated filters on algebraic frames, and we introduce the new concepts of subcompact elements and subcompactly generated filters. One of our main results is that the inverse topology on the space of minimal prime elements is compact if and only if every maximal subcompactly generated filter is completely prime. Furthermore, when the space of minimal prime elements is compact, each minimal prime has what we call the compact absoluteness property.
Model
Digital Document
Publisher
Florida Atlantic University
Description
Topological Data Analysis (TDA) is a relatively new field of research that utilizes topological notions to extract discriminating features from data. Within TDA, persistent homology (PH) is a robust method to compute multi-dimensional geometric and topological features of a dataset. Because these features are often stable under certain perturbations of the underlying data, are often discriminating, and can be used for visualization of structure in high-dimensional data and in statistical and machine learning modeling, PH has attracted the interest of researchers across scientific disciplines and in many industry applications. However, computational costs may present challenges to effectively using PH in certain data contexts, and theoretical stability results may not hold in practice. In this dissertation, we develop an algorithm that can reduce the computational burden of computing persistent homology on point cloud data. Naming it Delaunay-Rips (DR), we define, implement, and empirically test this computationally tractable simplicial complex construction for computing persistent homology of Euclidean point cloud data. We demonstrate the practical robustness of DR for persistent homology in comparison with other simplicial complexes in machine learning applications such as predicting sleep state from patient heart rate. To justify the theoretical stability of DR, we prove the stability of the Delaunay triangulation of a point cloud P under perturbations of the points of P. Specifically, we impose a notion of genericity on the points of P to ensure stability. In the final chapter, we contribute to the field of computational biology by taking a data-driven approach to learning topological features of designed proteins from their persistence diagrams. We find correlations between the learned topological features and biochemical features to investigate how protein structure relates to features identified by subject-matter experts. We train several machine learning models to assess the performance of incorporating topological features into training with biochemical features. Using cover-tree differencing via entropy reduction (CDER), we identify distinguishing regions of the persistence diagrams of stable/unstable proteins. Most notably, we find statistically significant improvement in classification performance (in terms of average precision score) for certain designed secondary structure topologies.
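Based on the construction as described (Delaunay simplices filtered by Rips-style values), here is a minimal 2-D sketch; it is one illustrative reading of the definition, not the dissertation's implementation, and the random point cloud is an assumption for the example.

```python
import numpy as np
from itertools import combinations
from scipy.spatial import Delaunay

def delaunay_rips(points):
    """Delaunay-Rips sketch: take the simplices of the Delaunay triangulation
    and filter each by its Rips value, the largest pairwise vertex distance."""
    faces = set()
    for simplex in Delaunay(points).simplices:
        for k in range(1, len(simplex) + 1):      # all faces of each triangle
            faces.update(combinations(sorted(simplex), k))
    filtration = {}
    for face in faces:
        filtration[face] = 0.0 if len(face) == 1 else max(
            np.linalg.norm(points[i] - points[j]) for i, j in combinations(face, 2))
    return filtration          # (simplex, value) pairs ready for a PH package

pts = np.random.default_rng(0).random((25, 2))
filt = delaunay_rips(pts)
print(len(filt), "simplices in the filtered complex")
```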
Model
Digital Document
Publisher
Florida Atlantic University
Description
The goal of this dissertation is to estimate the precise asymptotics for the number of geometric equivalence classes of Morse functions on the 2-sphere. Our approach utilizes the Lagrange inversion formula, Cauchy’s coefficient formula, and the saddle point method for the asymptotic analysis of contour integrals to analyze the generating function derived by L. Nicolaescu, expressed as the inverse of an elliptic integral. We use complex analysis, nonlinear functional analysis in infinite sequence spaces, and interval arithmetic to write the MATLAB programs that validate our results. This work answers questions posed by Arnold and Nicolaescu, furthering our understanding of the topological properties of Morse functions on two-dimensional manifolds, and it demonstrates the effectiveness of a computer-assisted approach to asymptotic analysis.
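For orientation, the coefficient-extraction identity at the heart of this approach is Cauchy's coefficient formula, to which the saddle point method is applied; the standard statement is included here for reference only.

```latex
% Cauchy's coefficient formula: for F analytic near 0 and a suitable radius r,
\[
  [z^n]\,F(z) \;=\; \frac{1}{2\pi i}\oint_{|z|=r}\frac{F(z)}{z^{\,n+1}}\,dz .
\]
% The saddle point method picks the radius r = r_n minimizing F(r)/r^n,
% i.e. the root of
\[
  \frac{r\,F'(r)}{F(r)} \;=\; n ,
\]
% and expands the integrand around that point to extract the asymptotics.
```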
Model
Digital Document
Publisher
Florida Atlantic University
Description
Storms are observed in the North Atlantic Ocean every year. Storm trajectories exhibit random behavior, and the storms themselves are costly to society. Data from the National Oceanic and Atmospheric Administration (NOAA) records every storm’s track from 1851 to 2022, and these track data can aid decision-making regarding storm behavior. In this article, data analysis is performed on historical storm tracks during the years 1966 to 2022, for which satellite information is available. This analysis is used to determine whether a storm’s trajectory is statistically dependent on other storms’ trajectories at varying spatial distances. The proposed model is a spatial statistical model fitted on an in-sample data set to determine the spatial relationship for storm trajectories at all pairwise directions or orientations. Afterwards, the model is assessed on an out-of-sample test data set for performance evaluation.
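The abstract does not name the specific model, so as one common way to quantify directional spatial dependence of the kind described, here is a minimal sketch of an empirical directional semivariogram; the binning scheme, angular tolerance, and synthetic data are all assumptions for illustration, not the article's method.

```python
import numpy as np

def directional_semivariogram(coords, values, angle, tol, bins):
    """Empirical semivariogram restricted to pairs whose separation vector
    lies within `tol` radians of `angle` (folded mod pi, since pairs are
    unordered); gamma(h) = mean squared half-difference at lag distance h."""
    diffs = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diffs, axis=-1)
    ang = np.arctan2(diffs[..., 1], diffs[..., 0])
    iu = np.triu_indices(len(coords), k=1)            # each pair counted once
    d, a = dist[iu], ang[iu]
    sq = 0.5 * (values[:, None] - values[None, :])[iu] ** 2
    keep = np.abs(np.angle(np.exp(2j * (a - angle)))) / 2 < tol
    gamma = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        sel = keep & (d >= lo) & (d < hi)
        gamma.append(sq[sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(1)
coords = rng.random((200, 2)) * 10                    # synthetic locations
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(200)
bins = np.linspace(0.0, 5.0, 11)
print(directional_semivariogram(coords, values, angle=0.0, tol=np.pi / 8, bins=bins))
```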