Department of Physics

Related Entities
Member of: Graduate College
Model: Digital Document
Publisher: Florida Atlantic University
Description:
An algorithm was developed to determine IMRT optimization parameters within the Elekta Monaco® treatment planning system that increase dose homogeneity and dose conformity in the planning target volume. The algorithm determines the parameters by calculating the difference between two pairs of dose points along the target volume's dose volume histogram: Dmax − Dmin and D2 − D98. It was tested on the Elekta Monaco® treatment planning system at GenesisCare of Coconut Creek, Florida, using CT data from 10 anonymized patients with non-small cell lung cancer of various tumor sizes and locations. Nine iterations of parameters were tested on each patient. Once the ideal parameters were found, the results were evaluated using the ICRU Report 83 homogeneity index as well as the Paddick conformity index. Based on this research, it is recommended that at least three iterations of IMRT optimization parameters be calculated to find the ideal parameters.
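To make the evaluation step concrete, the following minimal sketch (not the thesis code; the DVH and the volumes are hypothetical) computes the ICRU Report 83 homogeneity index and the Paddick conformity index from cumulative DVH data:

```python
import numpy as np

def dose_at_volume(dose, cum_volume_pct, v_pct):
    """Dose received by at least v_pct of the target (e.g. D2, D98),
    interpolated from a cumulative DVH (volume decreases with dose)."""
    return np.interp(v_pct, cum_volume_pct[::-1], dose[::-1])

def icru83_homogeneity(dose, cum_volume_pct):
    d2 = dose_at_volume(dose, cum_volume_pct, 2.0)
    d98 = dose_at_volume(dose, cum_volume_pct, 98.0)
    d50 = dose_at_volume(dose, cum_volume_pct, 50.0)
    return (d2 - d98) / d50        # 0 is perfectly homogeneous

def paddick_conformity(tv, piv, tv_piv):
    """tv: target volume, piv: prescription isodose volume,
    tv_piv: target volume covered by the prescription isodose."""
    return tv_piv**2 / (tv * piv)  # 1 is perfectly conformal

# hypothetical DVH: dose (Gy) vs. % of target receiving at least that dose
dose = np.linspace(0, 70, 141)
cum_vol = 100 / (1 + np.exp((dose - 60) / 1.5))   # toy sigmoid DVH
print(icru83_homogeneity(dose, cum_vol))
print(paddick_conformity(tv=30.0, piv=35.0, tv_piv=28.5))
```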
Model: Digital Document
Publisher: Florida Atlantic University
Description:
The Monaco treatment planning system offers three dose calculation algorithms for calculating 3D treatment plans: Monte Carlo (MC), Collapsed Cone (CC), and Pencil Beam. The aim of this study is an in-depth analysis of the Monte Carlo and Collapsed Cone dose calculation methods to find the optimal parameters for clinical use of each algorithm.
An end-to-end phantom with inhomogeneities was scanned and the DICOM images were imported into Monaco for contouring and planning. Treatment plans were then created in Monaco for both MC and CC using different permutations of variables, yielding approximately 400 plans. These variables include CT slice thickness, grid size, statistical uncertainty, and beam energy. Following planning, the end-to-end phantom was irradiated on an Elekta linac for each beam energy. The measured clinical beam data were then compared to the computed plans for each dose calculation method.
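As an illustration of the study design (not the study's actual script; the parameter values below are hypothetical), the permutations of planning variables can be enumerated as a Cartesian product:

```python
from itertools import product

# hypothetical values for each planning variable
slice_thickness_mm = [1.0, 2.0, 3.0]
grid_size_mm = [1.0, 2.0, 3.0]
statistical_uncertainty_pct = [0.5, 1.0, 2.0]
beam_energy_mv = [6, 10, 15]
algorithms = ["MC", "CC"]

plans = [
    dict(algorithm=alg, slice_mm=st, grid_mm=g, unc_pct=u, energy_mv=e)
    for alg, st, g, u, e in product(
        algorithms, slice_thickness_mm, grid_size_mm,
        statistical_uncertainty_pct, beam_energy_mv)
]
print(len(plans))  # 162 combinations in this toy grid
```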
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Unveiling the secrets of gravity necessitates numerical relativity simulations of gravitational systems, as the observations made by gravitational wave detectors require interpretation. These numerical simulations, in turn, require physical, constraint-satisfying initial data, so the accuracy of a simulation goes hand in hand with the accuracy of its initial data. Constructing accurate initial data is therefore an indispensable task, and it is the subject of this dissertation.
Here, we present the newly developed pseudospectral code Elliptica, an infrastructure for the construction of initial data for a wide range of binary and single gravitational systems. The elliptic equations under consideration are solved on a single spatial hypersurface of the spacetime manifold. Using coordinate maps, the hypersurface is covered by patches whose boundaries can adapt to the surfaces of the compact objects. To solve elliptic equations with arbitrary boundary conditions, Elliptica deploys a Schur complement domain decomposition method with a direct solver. In this version, we use cubed-sphere coordinate maps, and the fields are expanded in Chebyshev polynomials of the first kind. We explain the building blocks of Elliptica and the initial data construction algorithm for black hole-neutron star binary systems, and we perform convergence tests and evolve the data to validate our results. Within our framework, the neutron star can reach spin values close to breakup with arbitrary spin direction, while the black hole can have arbitrary spin with dimensionless spin magnitude ~ 0.8.
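The core numerical technique can be illustrated with a minimal sketch (this is not Elliptica itself; it is a one-dimensional toy problem): solving an elliptic equation u'' = f with Dirichlet boundary conditions by Chebyshev collocation and a direct linear solve.

```python
import numpy as np

def cheb(n):
    """Chebyshev differentiation matrix and Gauss-Lobatto points
    (standard construction, cf. Trefethen, 'Spectral Methods in MATLAB')."""
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

n = 24
D, x = cheb(n)
D2 = D @ D                         # second-derivative operator

f = np.exp(x)                      # right-hand side of u'' = e^x
A, b = D2.copy(), f.copy()
A[0, :] = A[-1, :] = 0.0           # impose u(1) = u(-1) = 0
A[0, 0] = A[-1, -1] = 1.0
b[0] = b[-1] = 0.0

u = np.linalg.solve(A, b)          # direct solve
exact = np.exp(x) - x * np.sinh(1.0) - np.cosh(1.0)
print(np.max(np.abs(u - exact)))   # error near machine precision
```

The same ingredients, a spectral expansion in Chebyshev polynomials plus a direct solver, underlie the multi-patch Schur complement scheme described above.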
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Medical professionals use CT images to obtain information about the size, shape, and location of lung nodules. This information helps radiologists and oncologists identify the type of cancer and create a treatment plan. However, diagnosing the type of lung cancer is often error-prone and time-consuming. One way to address these problems is with convolutional neural networks. In this thesis, we developed a convolutional neural network that detects abnormalities in lung CT scans and further categorizes the abnormalities as benign, malignant adenocarcinoma, or malignant squamous cell carcinoma. Our network is based on DenseNet, which utilizes dense connections between layers (dense blocks) so that all layers are connected. Because all layers are connected, later layers can reuse features from earlier layers, which speeds up training and makes the network computationally efficient. To retrain this network, we used CT images from 314 patients (over 1,500 CT images) acquired from the National Lung Screening Trial (NLST), consisting of 42 lung adenocarcinoma, 78 squamous cell carcinoma, 118 non-cancer, and 76 benign cases. These images were divided into a training dataset (70%) and a validation dataset (30%). We trained our network on the training dataset and then checked the accuracy of the model on the validation dataset. Our model was able to categorize lung cancer with an accuracy of 88%. We then calculated the confusion matrix, precision, recall (sensitivity), and F1 score of our model for each category. Our model classifies normal CT images with an accuracy of 89%, a precision of 94%, and an F1 score of 93%. For benign nodules the accuracy was 92%, with a precision of 97% and an F1 score of 86%, while for adenocarcinoma and squamous cell carcinoma the accuracy was 98% and 93%, the precision 85% and 84%, and the F1 score 92% and 86.9%, respectively. The relatively high accuracy of our model shows that convolutional neural networks can be a valuable tool for the classification of lung cancer, especially in small-city or underserved rural hospital settings, and can play a role in achieving healthcare equality.
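A minimal sketch of the model setup (not the thesis code; preprocessing, weights, and hyperparameters are hypothetical) adapts a pretrained DenseNet to the four categories described above:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # normal, benign, adenocarcinoma, squamous cell carcinoma

# DenseNet backbone with a new 4-way classification head
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """One optimization step; images: (B, 3, 224, 224), labels: (B,)."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# smoke test with random tensors standing in for preprocessed CT slices
print(train_step(torch.randn(8, 3, 224, 224),
                 torch.randint(0, NUM_CLASSES, (8,))))
```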
Model: Digital Document
Publisher: Florida Atlantic University
Description:
We introduce a novel geometric approach to characterize entanglement relations in large quantum systems. Our approach is inspired by Schumacher's singlet-state triangle inequality, which used an entropy-based distance to capture the strange properties of entanglement through geometric inequalities. Schumacher's construction uses classical entropy and can only describe the geometry of bipartite states. We extend his approach by using von Neumann entropy to create an entanglement monotone that generalizes to higher-dimensional systems. We achieve this by utilizing recent definitions of entropic areas, volumes, and higher-dimensional volumes for multipartite systems, which we introduce in this thesis. This enables us to differentiate systems with high quantum correlation from systems with low quantum correlation and to distinguish between different types of multipartite entanglement. It also enables us to describe some of the strange properties of quantum entanglement using simple geometric inequalities. Our geometrization of entanglement provides new insight into quantum entanglement. Perhaps by constructing well-motivated geometric structures (e.g., relations among areas and volumes), a set of trivial geometric inequalities can reveal some of the complex properties of higher-dimensional entanglement in multipartite systems. We provide numerous illustrative applications of this approach.
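The basic entropic quantities behind this construction can be illustrated with a minimal sketch (not the thesis code): computing von Neumann entropies and a Schumacher-style entropic distance for a partially entangled two-qubit pure state.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace(rho, keep):
    """Reduce a two-qubit density matrix to subsystem 0 or 1."""
    r = rho.reshape(2, 2, 2, 2)    # indices (a, b, a', b')
    if keep == 0:
        return np.trace(r, axis1=1, axis2=3)
    return np.trace(r, axis1=0, axis2=2)

# partially entangled pure state cos(t)|00> + sin(t)|11>
t = np.pi / 6
psi = np.array([np.cos(t), 0.0, 0.0, np.sin(t)])
rho_ab = np.outer(psi, psi.conj())

S_ab = von_neumann_entropy(rho_ab)                  # 0 for a pure state
S_a = von_neumann_entropy(partial_trace(rho_ab, 0))
S_b = von_neumann_entropy(partial_trace(rho_ab, 1))

# entropic "side length" d(A,B) = S(A|B) + S(B|A) = 2 S(AB) - S(A) - S(B);
# it goes negative for entangled states, unlike its classical counterpart
print(S_a, S_b, 2 * S_ab - S_a - S_b)
```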
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Dosimetric uncertainty in very small (< 2 x 2 cm²) photon fields is notably higher than in standard fields, which has raised research questions about the use of a small-field virtual cone with variable multileaf collimator (MLC) fields. We evaluate the efficacy of a virtual cone with a fixed MLC field for stereotactic radiosurgery (SRS) of small targets such as trigeminal neuralgia.
We employed a virtual cone technique with a fixed field geometry, called the fixed virtual cone (fVC), for small-target radiosurgery using the EDGE linac (Varian Medical Systems, Palo Alto, CA). The fVC is characterized by a 0.5 cm x 0.5 cm high-definition MLC field of a 10 MV flattening filter-free (FFF) beam defined at 100 cm SAD, with the jaws positioned at 1.5 cm x 1.5 cm. A spherical dose distribution equivalent to a 5 mm cone was generated using 10–14 non-coplanar partial arcs. The dosimetric accuracy of this technique was validated with absolute dose measurements using the SRS MapCHECK (Sun Nuclear Corporation, FL) and EBT3 film (Ashland Inc., NJ). For quality assurance (QA), 10 treatment plans for trigeminal neuralgia consisting of various arc fields at different collimator angles were analyzed retrospectively using 6 MV and 10 MV FFF beams, including a field-by-field study (n = 130 fields). Dose outputs were compared between the SRS MapCHECK measurements and the Eclipse treatment planning system (TPS) with the Acuros XB algorithm (version 16.1). In addition, important clinical parameters of 15 cases treated for trigeminal neuralgia were evaluated to assess clinical performance. Moreover, dosimetric uncertainties (field output factors, dose/MU) arising from a minute (± 0.5–1.0 mm) leaf shift in the field defining the fVC were examined using the TPS, SRS diode (PTW 60018) measurements, and Monte Carlo (MC) simulations.
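For the field-by-field comparison, the underlying check is simple; a minimal sketch (not the study's analysis code; the dose values and the 3% action level are hypothetical) is:

```python
import numpy as np

def percent_deviation(measured, planned):
    """Signed percent difference of measurement from the TPS plan."""
    return 100.0 * (measured - planned) / planned

tps_dose = np.array([1.02, 0.98, 1.01, 0.99])    # Gy per field (TPS)
meas_dose = np.array([1.00, 0.99, 1.00, 1.00])   # Gy per field (SRS MapCHECK)

dev = percent_deviation(meas_dose, tps_dose)
print(dev)
print(np.all(np.abs(dev) < 3.0))   # flag fields outside a 3% action level
```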
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Quantum gravity attempts to unify general relativity (GR) and quantum theory, and is one of the most challenging research areas in theoretical physics. Loop quantum gravity (LQG) is a background-independent and non-perturbative approach to a theory of quantum gravity. The spinfoam formulation gives the covariant path integral formulation of LQG.
The spinfoam amplitude plays a crucial role in the spinfoam formulation by defining the transition amplitude of covariant LQG. It is particularly interesting for testing the semiclassical consistency of LQG because of the connection between the semiclassical approximation of the path integral and the stationary phase approximation. Recent semiclassical analysis reveals an interesting relation between spinfoam amplitudes and Regge calculus, which discretizes GR on triangulations. This relation makes the semiclassical consistency of covariant LQG promising. The spinfoam formulation also provides ways to study the n-point functions of quantum-geometry operators in LQG.
Despite these novel and crucial analytic results, computational complexity has obstructed further exploration of spinfoam models. Nevertheless, numerical approaches to spinfoams open new windows to circumvent this obstruction. There has been enlightening progress on the numerical computation of spinfoam amplitudes and the two-point function, and these numerical techniques should expand the toolbox for investigating LQG.
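The mechanism that links path integrals to their semiclassical limit can be illustrated numerically with a toy model (not from this dissertation): a one-dimensional oscillatory integral evaluated directly and compared with its leading stationary phase estimate.

```python
import numpy as np
from scipy.integrate import quad

lam = 200.0                  # large parameter, analogous to large spins
f = lambda x: x**2           # phase function, stationary point at x = 0

# direct numerical evaluation of I = int_{-1}^{1} exp(i*lam*f(x)) dx
re, _ = quad(lambda x: np.cos(lam * f(x)), -1.0, 1.0, limit=2000)
im, _ = quad(lambda x: np.sin(lam * f(x)), -1.0, 1.0, limit=2000)
numeric = re + 1j * im

# leading stationary phase term: sqrt(2*pi/(lam*|f''(0)|)) * exp(i*pi/4)
stationary = np.sqrt(np.pi / lam) * np.exp(1j * np.pi / 4)

print(numeric, stationary)   # agree up to corrections of order 1/lam
```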
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Through the variational principle, we review the gravitational field equations in Einstein gravity and in modified f(R) gravity theories. The metric and Palatini formalisms are two different approaches to obtaining the field equations in f(R) gravity. In this framework, we investigate the energy conditions in the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric using the Raychaudhuri equation. We then focus on wormhole geometries and their thermodynamic behavior in the Palatini and metric versions of modified f(R) gravity separately. In general relativity, wormhole spacetimes require exotic matter, which violates the null and weak energy conditions. It has been shown that in f(R) gravity the matter threading the wormhole can satisfy the energy conditions, and it is the higher-order curvature derivative terms, which may be interpreted as a gravitational fluid, that support these geometries. Therefore, in f(R) gravity it is not necessary to introduce exotic matter to obtain traversable wormholes. In the framework of metric and Palatini f(R) gravity, we investigate the thermodynamic properties of evolving wormholes. We obtain an expression for the variation of the total entropy to discuss the thermodynamic behavior of wormhole spacetimes, and we extend the investigation to the apparent and event horizons. Finally, we use the radii of these horizons to determine the validity of the generalized second law of thermodynamics, which states that the rate of change of the total entropy is non-negative.
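The FLRW energy-condition analysis can be illustrated in the general relativity limit with a short symbolic computation (a toy sketch, not from the dissertation): for a flat FLRW universe, the null energy condition ρ + p ≥ 0 reduces to Ḣ ≤ 0.

```python
import sympy as sp

t, G = sp.symbols('t G', positive=True)
a = sp.Function('a', positive=True)(t)

H = sp.diff(a, t) / a          # Hubble parameter
kappa = 8 * sp.pi * G          # units with c = 1

# flat FLRW Friedmann equations in the GR limit f(R) = R
rho = 3 * H**2 / kappa
p = -(2 * sp.diff(H, t) + 3 * H**2) / kappa

# null energy condition: rho + p = -2*Hdot/kappa, so NEC <=> Hdot <= 0
print(sp.simplify(rho + p))
```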
Model: Digital Document
Publisher: Florida Atlantic University
Description:
In proton therapy systems with pencil-beam scanning, the output contribution of the halo effect (a low-intensity tail of the beam spot) is not necessarily modeled in the treatment planning system (TPS), yet it can significantly affect a patient's dose distribution. The magnitude of this contribution depends on the field size being irradiated. Although much research has investigated this field-size dependence, few reports address dose calculations that include the halo effect. In this work we investigate the halo effect, including the field size factor, target depth factor, and air gaps with a range shifter, for a Varian ProBeam.
Dose calculations created in the Eclipse Treatment Planning System (v15.6 TPS) are compared with plane-parallel ionization chamber measurements (PTW Octavius 1500) using the PCS and AcurosPT MC models at different isocenters: 5 cm, 10 cm, and 20 cm. We find that with the AcurosPT algorithm, deviations range from -7.53% (for a 2 cm field with a 25 cm air gap and range shifter) to +7.40% (for a 20 cm field with a 15 cm air gap and range shifter), whereas with the PCS algorithm the deviations range from -2.07% (for a 20 x 20 cm field in open conditions) to -6.29% (for a 20 x 20 cm field with a 25 cm air gap and range shifter).
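To illustrate why the halo makes the output field-size dependent, here is a toy model (not the study's beam model; all parameters are hypothetical): each spot is a narrow primary Gaussian plus a broad, low-amplitude halo Gaussian, and the central dose of a uniform square scan pattern is summed over spots.

```python
import numpy as np

def central_dose(field_cm, sigma_primary=0.4, sigma_halo=4.0, halo_frac=0.05):
    """Dose at the field center from a square grid of pencil-beam spots."""
    half = field_cm / 2.0
    xs = np.arange(-half, half + 1e-9, 0.25)    # 2.5 mm spot spacing
    xx, yy = np.meshgrid(xs, xs)
    r2 = xx**2 + yy**2                           # spot distances to center
    primary = (1.0 - halo_frac) * np.exp(-r2 / (2.0 * sigma_primary**2))
    halo = halo_frac * np.exp(-r2 / (2.0 * sigma_halo**2))
    return float((primary + halo).sum())

# field size factors normalized to a 10 cm field: the broad halo makes
# the output grow with field size
for f_cm in [2.0, 5.0, 10.0, 20.0]:
    print(f_cm, central_dose(f_cm) / central_dose(10.0))
```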
Model: Digital Document
Publisher: Florida Atlantic University
Description:
Breast cancer is one of the most common cancers among women and is a leading cause of cancer mortality in women. Prostate cancer, in turn, is the second most frequent malignancy in men worldwide.
The early detection of prostate cancer is fundamental to reduce mortality and increase the survival rate. A comparison between six types of machine learning models as Logistic Regression, Decision Tree, Random Forest, Gradient Boosting, k Nearest Neighbors, and Naïve Bayes has been performed. This research aims to identify the most efficient machine learning algorithms for identifying the most significant risk factors of prostate and breast cancers. For this reason, National Health Interview Survey (NHIS) and Prostate, Lung, Colorectal, and Ovarian (PLCO) datasets are used. A comprehensive comparison of risk factors leading to these two crucial cancers can significantly impact early detection and progressive improvement in survival.