Theses
Permanent URI for this collection: https://uwspace.uwaterloo.ca/handle/10012/6
The theses in UWSpace are publicly accessible unless restricted due to publication or patent pending.
This collection includes a subset of theses submitted by graduates of the University of Waterloo as a partial requirement of a degree program at the Master's or PhD level. It includes all electronically submitted theses. (Electronic submission was optional from 1996 through 2006. Electronic submission became the default submission format in October 2006.)
This collection also includes a subset of UW theses that were scanned through the Theses Canada program. (The subset includes UW PhD theses from 1998 to 2002.)
Recent Submissions
Decentralized and Agentic Spectrum Management in Cognitive Wireless Networks (University of Waterloo, 2026-05-11). Abognah, Anas.

Dynamic spectrum management and sharing have been the subject of extensive research and development for many years. The ever-increasing demand for wireless spectrum from an exponentially growing number of devices and applications has led to a spectrum scarcity problem that remains unsolved. In addition, the rigid and prolonged regulatory processes of manually allocating spectrum have left large swaths of spectrum bands underutilized and inaccessible to new applications. Dynamic spectrum sharing can alleviate these problems by enabling new applications and devices to opportunistically access unused spectrum. Regulatory bodies have proposed multiple spectrum sharing frameworks in which access to the shared spectrum is controlled and managed by a centralized third-party controller. However, these centralized frameworks fail to provide truly dynamic and scalable spectrum sharing: they lack mechanisms for spectrum trading and do not give primary users incentives to participate. In addition, existing decentralized spectrum management approaches rely on numerical optimization models that lack autonomous decision-making capabilities and are semantically blind, unable to interpret unstructured regulatory policies and requirements. The need remains for a fully dynamic, autonomous spectrum sharing framework that satisfies regulatory requirements and provides built-in economic incentives. In this thesis, we propose and implement a fully decentralized spectrum management and sharing framework that resolves the issues inherent in the centralized model and closes the semantic gap through autonomous cognitive agents.
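The thesis's BLAST trading layer settles spectrum trades through second-price sealed-bid (Vickrey) auctions, in which the highest bidder wins but pays only the second-highest bid, making truthful bidding a dominant strategy. A minimal sketch of the mechanism itself (the agent names and bid values are made up for illustration and are not from the thesis):

```python
def second_price_auction(bids):
    """Second-price sealed-bid (Vickrey) auction.

    bids: {bidder: bid}. Returns (winner, price): the highest bidder
    wins but pays the second-highest bid, so bidding one's true
    valuation is a dominant strategy.
    """
    if len(bids) < 2:
        raise ValueError("need at least two sealed bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid sets the price
    return winner, price

# Hypothetical spectrum-band auction among three bidding agents
print(second_price_auction({"agent_a": 12.0, "agent_b": 9.5, "agent_c": 7.0}))
# -> ('agent_a', 9.5)
```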
We implement a comprehensive decentralized model that converges blockchain technology, federated learning, and Large Language Model (LLM) agents to automate and optimize dynamic spectrum sharing, sensing, and access in a single framework. The implemented model eliminates reliance on centralized brokers through a two-tier Hyperledger Fabric blockchain network that guarantees trust, transparency, and immutable audit trails for spectrum sharing while eliminating single points of failure. In addition, the model facilitates cooperative decentralized spectrum sensing via federated model training on the blockchain, achieving 92% detection accuracy. Finally, we implement BLAST (Blockchain LLM Agentic Spectrum Trading), which eliminates static decision-making and requirements analysis through autonomous cognitive agents. We demonstrate that LLM-driven agents employing game-theoretic reasoning within second-price sealed-bid auctions maximize social welfare and spectrum allocation efficiency, significantly outperforming traditional heuristic strategies and state-of-the-art non-LLM decentralized models. This research establishes a concrete architectural blueprint for 6G and beyond, where decentralized intelligence, economic incentives, and regulatory compliance coexist within a unified, autonomous execution framework.

Putting Humpty-Dumpty back together: characterizing coherent recombination in Stern-Gerlach interferometers (University of Waterloo, 2026-05-11). Meng, Danny.

I show that superficially similar implementations of Stern-Gerlach interferometers (SGIs) are expected to differ dramatically in their sensitivity to fields transverse to the primary acceleration direction. These transverse fields unavoidably accompany any static magnetic or electric field gradients, and have been shown by Comparat [Phys. Rev. A 101, 023606 (2020)] to limit precision applications of SGIs.
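For context, the "Humpty-Dumpty" limit can be phrased as a bound on spin-fringe visibility set by the overlap of the motional wavepackets that traverse the two interferometer arms. This is stated here only as standard background, not as the thesis's specific result: writing the state at recombination as a spin-motional entangled state,

```latex
% Spin-motional state at recombination and the resulting fringe visibility
\lvert \Psi \rangle = \tfrac{1}{\sqrt{2}}
  \left( \lvert \uparrow \rangle \otimes \lvert \chi_{\uparrow} \rangle
       + \lvert \downarrow \rangle \otimes \lvert \chi_{\downarrow} \rangle \right),
\qquad
V = \lvert \langle \chi_{\downarrow} \vert \chi_{\uparrow} \rangle \rvert ,
```

so imperfect recombination of the motional states, whether in position or momentum and whether longitudinal or transverse, directly suppresses the interference contrast.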
As a concrete example, I consider SGIs with ultracold Rb Rydberg atoms accelerated by spatially varying electric fields and find that the deleterious effect of transverse fields implies that only some implementations (sequences of field gradients, internal state swaps, and so on) may exhibit fringes with high visibility. I further show that these differences are not strongly dependent on the form of the initial state. I provide a derivation of the Humpty-Dumpty equation for a general initial state and show that it holds for any interferometry sequence where the force as a function of time is piecewise constant. A modified version of the equation is shown to hold for any general sequence with a linear potential. I then extend this analysis to the transverse components of the described SGI, and give a form for the time evolution operator that is analogous to the one used in deriving the Humpty-Dumpty equation.

Controlling Metabolic Flux in Vacuum-Assisted Fermentation: The Interplay of pH and Operating Mode in the Valorization of Glucose to Volatile Fatty Acids and Biofuels (University of Waterloo, 2026-05-11). Hooshmand, Masoomeh.

Over the last two decades, anaerobic digestion has been increasingly implemented for organic waste treatment and for the production of biogas and electric energy that can be exploited beyond the plant boundaries. However, the process has limitations, including high capital costs, low revenues from energy recovery, and the generation of nutrient-rich streams that require further treatment. Hence, anaerobic digestion has been modified to go beyond energy recovery through the production and recovery of higher-value products via anaerobic fermentation. A newly proposed process called IntensiCarb™ (IC) applies vacuum-assisted fermentation to enable process intensification, enhance resource recovery, and implement circular economy policies.
However, a critical knowledge gap exists regarding the influence of key operational parameters on its performance outcomes. This research addressed this uncertainty by systematically investigating the effects of pH (5.5, 7.0, and 9.0) and vacuum operating mode (sequential evaporation and intermittent evaporation) on the fermentation of glucose in lab-scale, semi-continuous reactors. The results demonstrated that pH regulated metabolic flux, inducing a shift from hydrogen- and butyrate-producing pathways to ethanol- and acetate-dominant fermentation as pH increased from 5.5 to 9.0. The vacuum-enhanced modes intensified the process, operating at double the organic loading rate of the conventional system. However, performance—evaluated based on COD-normalized product yields—was highly dependent on the interaction between pH and operating mode. At neutral and alkaline pH, a clear performance hierarchy was established against the baseline reference points: the intermittent evaporation-fermentation (IEF) mode yielded more total volatile fatty acids (TVFA) and hydrogen than both the sequential evaporation-fermentation (SEF) mode and conventional fermentation (CF). For instance, at pH 9.0, the IEF mode achieved a maximum TVFA yield of 63.0 ± 1.1%, outperforming both the SEF (60.1 ± 1.7%) and CF (56.7 ± 1.2%) baselines. This superior performance was attributed to the IEF mode’s ability to alleviate thermodynamic limitations (e.g., inhibitory hydrogen partial pressure) through more frequent, in-situ product removal compared to the end-of-cycle removal in SEF. Critically, this trend reversed under acidic conditions (pH 5.5), where CF produced a higher TVFA yield (64.6 ± 2.3%) than either IEF (53.2 ± 1.0%) or SEF (51.7 ± 1.0%). This antagonistic interaction was attributed to heightened product inhibition from the accumulation of undissociated VFAs under the intensified IC conditions. 
These findings reveal that while the IC process is a powerful platform for targeted chemical production, its performance is dictated by the interplay between pH and process intensification, which must be carefully managed to avoid inhibitory effects and maximize resource valorization.

Using GIS Spatial Analysis to Investigate Burial Complexities and Variation at Wadi Faynan 100, Jordan (University of Waterloo, 2026-05-11). Schwarz, Maegen-Rose.

This research investigates the spatial and stylistic relationships between EBA Ib burials excavated during the 2019 and 2023 field seasons of the Barqa Landscape Project (BLP) in Wadi Faynan, Jordan. Spatial analyses included Near analysis, Hot Spot analysis, and Inverse Distance Weighted (IDW) analyses, conducted in ArcPro. Results showed an increased level of grave good variety in the burials located on the southern hills of WF100. Emerging grave pairs (Graves 1 & 2, 4 & 5, 6 & 7, 8 & 9, and 10 & 11) were found to hold reciprocated proximity rankings. There was no apparent sign of hot or cold spots. Comparing the structures to one another and to external EBA Ib burial grounds, stylistic similarities between graves include external stone wall lining, flat floor stones, built-in alcoves at the northern end, and grave openings to the north, while the burials run north–south. Similarities between Wadi Faynan 100 and other EBA Ib sites in the southern Levant include grave orientation, flat floor stones, and artifacts. The results conclude that the Wadi Faynan 100 burial ground displays a level of connection between graves while also affording room for stylistic difference.
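The IDW analyses used in this study estimate a value at an unsampled location as a distance-weighted average of nearby measurements. A minimal sketch of the standard method (the sample coordinates and values below are invented for illustration, not data from WF100):

```python
def idw(points, values, query, power=2.0):
    """Inverse Distance Weighted interpolation at a single query point.

    points : list of (x, y) sample locations
    values : measured value at each sample location
    query  : (x, y) location to estimate
    power  : distance-decay exponent (2 is the common default)
    """
    num, den = 0.0, 0.0
    for (x, y), z in zip(points, values):
        d = ((x - query[0]) ** 2 + (y - query[1]) ** 2) ** 0.5
        if d == 0.0:          # query coincides with a sample point
            return z
        w = 1.0 / d ** power  # closer samples receive larger weights
        num += w * z
        den += w
    return num / den

# Hypothetical grave-good counts at four locations
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
vals = [2.0, 4.0, 6.0, 8.0]
print(idw(pts, vals, (0.5, 0.5)))  # equidistant samples reduce to the plain mean
```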
While sharing stylistic components with other EBA Ib burial grounds in the southern Levant, WF100 maintains its own individuality in terms of construction and patterns.

Safety Risk Framework for Autonomous Driving Systems Under Perception Degradation (University of Waterloo, 2026-05-08). Delcore, Spencer.

End-to-end autonomous driving systems demonstrate impressive capabilities under normal conditions but exhibit unpredictable safety degradation when perception is compromised by adverse weather, image distortions, or sensor failures. Existing safety assessment methods fail to capture how perception degradation propagates through these black-box architectures and impacts driving decisions, creating a fundamental gap in deployment readiness evaluation. This thesis presents a three-activity framework for quantifying and predicting safety risk in end-to-end autonomous driving systems under conditions where poor perception significantly affects safety. Activity 1 performs data augmentation and model execution through four sub-activities: dataset creation, model evaluation, safety assessment via the Total Risk metric, and feature extraction. The Total Risk metric aggregates incident severity through three components: time-to-incident severity, distance severity, and impact severity. Activity 2 trains a machine learning predictor on spatial and temporal features extracted from the model's auxiliary task outputs to predict expected deviations of Total Risk from its value under nominal conditions. Activity 3 deploys the trained predictor for real-time safety degradation detection, enabling proactive intervention strategies. Experiments across two architectures, UniAD and ST-P3, under 24 perception-relevant conditions spanning weather, lighting, sensor failures, and image distortions reveal several key insights for system designers.
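The abstract names the three severity components of Total Risk but not the aggregation rule. As a purely hypothetical sketch of how such a metric could be composed, here is an equally weighted combination of normalized components (the weights and normalization are assumptions of this illustration, not the thesis's definition):

```python
def total_risk(tti_severity, distance_severity, impact_severity,
               weights=(1 / 3, 1 / 3, 1 / 3)):
    """Aggregate per-incident severities into a single risk score.

    Each severity is assumed normalized to [0, 1]; the equal weighting
    here is hypothetical -- the thesis's actual aggregation rule is not
    given in the abstract.
    """
    components = (tti_severity, distance_severity, impact_severity)
    if not all(0.0 <= c <= 1.0 for c in components):
        raise ValueError("severities must be normalized to [0, 1]")
    return sum(w * c for w, c in zip(weights, components))

# A hypothetical near-miss: short time-to-incident, moderate distance, no impact
print(total_risk(0.9, 0.5, 0.0))
```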
Analysis across four test suites uncovers asymmetric generalization: predictors trained on severe conditions generalize reliably to less severe conditions, while the reverse fails catastrophically. Sensor failure conditions, such as camera crashes, create fundamentally different system failure modes depending on model architecture (UniAD produces no detections, while ST-P3 generates noisy hallucinations), requiring architecture-specific safety measures. Feature importance analysis demonstrates that early-stage auxiliary tasks provide the most informative degradation signals. This framework provides system designers with quantitative tools for architecture design, training data prioritization, and characterizing known deployment limitations, advancing the safety assurance of end-to-end autonomous driving systems.

Microbial Methane Oxidation and Community Dynamics in Southern Ontario Landfill Cover Soils (University of Waterloo, 2026-05-08). Willms, Nathanael.

Methane emissions from municipal solid waste landfills account for ~10% of global methane emissions, a significant contribution to climate change. Methanotrophic bacteria present in landfill cover soils (LCS) can mitigate these emissions by oxidizing methane. In capped landfills, surface methane emissions are often concentrated at highly emissive "hot spots". The spatial distribution and overall level of methane emissions vary over daily, seasonal, and multi-year timescales, driven by landfill age and meteorological factors. Methanotrophic community composition in LCS also varies spatially and temporally in response to methane emissions, soil moisture content, temperature, pH, and nutrient availability. The research presented in this thesis surveys community composition, methane flux, and soil chemistry across a decommissioned capped landfill cover over a total period of 8 years.
The overall objectives were to assess how soil bacterial and archaeal communities changed in response to shifting methane dynamics, as well as the role of key physicochemical factors in driving changes in the relative abundances of methanotrophic taxa. Surveying methane flux and soil methane concentrations at different sites across the landfill, I observed a long-term trend of disappearing methane at several former hot spots, concurrently with alterations to the landfill gas capture system that also saw two new hot spots emerge in a different area of the landfill. Active hot spots were lower in total nitrogen, NO3-, NO2-, and NH4+ relative to other sites, indicating that methanotrophy was likely N-limited in this landfill cover. Methane flux was significantly correlated with shifts in community composition, as were soil moisture content, pH, dissolved organic carbon, Ca2+, K+, Na+, and Cl-. Community profiling with 16S rRNA gene amplicon sequencing identified that active hot spots were dominated by methane oxidizing bacteria, especially by members of genera Methylobacter, Crenothrix, and Methylomicrobium, whereas sites without exposure to methane had much more diverse communities with methanotrophs constituting <1% of the community in all but one instance. Former hot spots, which experienced high methane emissions in 2020 that declined by 2022, saw an overall decline in total methanotroph relative abundance, but still maintained a higher relative abundance of methanotrophs than sites that had never been hot spots. Not all methanotrophic ASVs at these sites declined at the same rate, with some ASVs maintaining approximately the same relative abundance as during high methane efflux. Additionally, methane flux at the former hot spots was significantly more negative than at completely inactive sites, indicating oxidation of near-atmospheric concentrations of methane. 
These findings indicate the potential for "persistent" methanotrophs which, following methane enrichment, can survive at high proportions of a soil microbial community for long periods when methane availability becomes reduced and/or infrequent. Methanotrophs pursuing this ecological strategy could be crucial to the mitigation of methane emissions from older landfills with lower background levels of methane but sporadic, more intense efflux events, and could be employed in strategic microbial amendments to LCS.

Human Remote Sensing In Long-Term Care Facility Using Low-Cost FMCW Radar (University of Waterloo, 2026-05-08). Trinh, Huy.

This thesis studies privacy-preserving human remote sensing in Long-Term Care (LTC) environments using low-cost 60 GHz FMCW radar. The main challenge addressed is the reliable sensing of weak or quasi-static human states in LTC environments, such as quiet occupancy, prolonged sitting or lying, and post-fall floor presence, which are clinically and operationally important yet difficult to detect with conventional methods. These scenarios are precisely the cases in which sensing reliability is most critical for resident safety, caregiver response, and building operation. The thesis therefore investigates how far a low-cost, low-resolution radar platform can be pushed through signal processing, machine learning methods, and simulation-driven data generation to deliver useful room-level awareness without relying on cameras or wearables. The work is organized around a practical deployment view rather than a single algorithmic contribution. It begins with the sensing hardware and baseline signal-processing chain, then develops methods for quasi-static occupancy detection and post-fall floor-occupancy detection, extends these ideas to imaging-style radar representations suitable for edge deployment, and finally studies simulation and digital-twin approaches for sim-to-real radar learning.
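As background on the baseline signal-processing chain: an FMCW radar maps target range to the beat frequency of the de-chirped signal, R = c·f_b/(2S) with chirp slope S = B/T_c, and its range resolution is ΔR = c/(2B). A small sketch with parameters typical of a 60 GHz-band sensor (the specific bandwidth and chirp time are illustrative, not the thesis's configuration):

```python
C = 3.0e8  # speed of light, m/s

def range_from_beat(f_beat_hz, bandwidth_hz, chirp_time_s):
    """Target range implied by an FMCW beat frequency: R = c * f_b / (2 * S)."""
    slope = bandwidth_hz / chirp_time_s  # chirp slope S, in Hz/s
    return C * f_beat_hz / (2.0 * slope)

def range_resolution(bandwidth_hz):
    """Two targets closer than c / (2B) fall into the same range bin."""
    return C / (2.0 * bandwidth_hz)

B, Tc = 4.0e9, 100e-6  # illustrative chirp: 4 GHz swept over 100 us
print(range_resolution(B))            # 3.75 cm range bins
print(range_from_beat(1.0e6, B, Tc))  # a 1 MHz beat tone -> 3.75 m
```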
Across these working stages, the unifying theme is the design of scalable, privacy-preserving, and energy-aware radar sensing methods for ambient assisted-living environments.

Making Meaning: Translating Traditional Korean Pattern Through Digital Fabrication (University of Waterloo, 2026-05-08). Gu, Jamie.

This thesis examines how patterns derived from Korean architectural references can be translated through contemporary fabrication without losing the relationships that make them spatially and culturally meaningful. Rather than treating pattern as a detachable surface effect, the research understands it as an architectural condition that mediates boundary, light, visibility, enclosure, and material presence through density, continuity, edge condition, thickness, and relief. Beginning from the observation that pattern changes when the conditions of making change, the thesis asks not whether a historical artefact can be reproduced exactly, but whether culturally grounded patterned relationships can remain legible through contemporary making. Through fieldwork in South Korea, pattern redrawing, and comparative prototyping across laser cutting, CNC milling, 3D printing, and mould-based casting, the research tests what survives translation, what transforms, and which design decisions preserve patterned coherence. The work evaluates these translations through criteria including density, edge clarity, continuity, relief, assembly logic, and threshold performance, considering both spatial effect and cultural continuity. The findings suggest that meaning does not depend on exact replication, but on whether key relationships among geometry, material, making, assembly, and spatial effect remain coherent through adaptation.
The thesis therefore frames translation as a design responsibility and craft as a form of situated judgement exercised through both digital and material processes of making.

A Comprehensive Evaluation Framework for Synthetic ECG: Assessing Fidelity, Utility, and Privacy (University of Waterloo, 2026-05-08). Li, Yixin.

Keywords: synthetic time-series data; synthetic data generation; privacy-enhancing technology; time-series data evaluation; electrocardiogram; dynamic time warping; membership inference attacks.

Towards Explainability for Language Models in Security Testing (University of Waterloo, 2026-05-07). Hadfield, Cameron.

Modern generative Language Models (LMs) function as black boxes, requiring significant trust in their capabilities and making it difficult to understand the reasoning behind their decisions. As these LMs are increasingly used for code and test-case generation, testers must trust them without knowing what drives the model's outputs. To improve accuracy, modern LMs rely on supplementary documentation, such as Retrieval-Augmented Generation (RAG), or other content provided directly in their prompts to enhance background knowledge. When testers use LM-generated test cases for other purposes, such as fuzz testing, they must place greater trust in their quality, as seed cases can significantly affect fuzzer coverage. We adapt existing methods to build an analysis pipeline that explains document retrieval when the LM relies on documentation to generate test cases. We achieve this with only black-box access to the LMs under test. We use RFC 959 (the File Transfer Protocol, FTP) and two synthetic protocols to isolate the LM's reliance on data in its RAG system. Statistical analysis shows that the explanations from our pipeline capture real phenomena rather than random data. To aid integration with automated security testing, we present a formal definition of protocol communication.
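A common black-box way to attribute an LM's output to individual retrieved documents, in the same spirit as the explanation pipeline described here, is leave-one-out ablation: regenerate with each document withheld and score how much the output changes. A sketch under stated assumptions (the `generate` and `similarity` callables are hypothetical stand-ins, not the thesis's tooling):

```python
def retrieval_attribution(generate, similarity, prompt, docs):
    """Score each retrieved document by how much removing it shifts the output.

    generate(prompt, docs) -> str   : black-box LM call with a RAG context
    similarity(a, b)       -> float : output similarity in [0, 1]
    Returns {doc_index: 1 - similarity(full_output, ablated_output)}.
    """
    full = generate(prompt, docs)
    scores = {}
    for i in range(len(docs)):
        ablated = generate(prompt, docs[:i] + docs[i + 1:])
        scores[i] = 1.0 - similarity(full, ablated)
    return scores

# Toy stand-ins: the "LM" just concatenates, similarity is token overlap (Jaccard).
def toy_generate(prompt, docs):
    return " ".join([prompt] + docs)

def toy_similarity(a, b):
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb)

print(retrieval_attribution(toy_generate, toy_similarity,
                            "ftp", ["USER spec", "PASV spec"]))
```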
This formalism helps map our pipeline's features to the protocol domain and lays a foundation for future work with fuzzers. The explanations our pipeline generates yield plausible results, with some unexpected outputs, suggesting the need for tuning to improve explanations.

Computational Structural Biology in Modern Integrative Discovery Pipelines (University of Waterloo, 2026-05-07). Jofily de Lima Rangel, Paula.

Computational structural biology (CSB) and computer-aided drug design enable research and discovery pipelines that drive scientific innovation. The integration of CSB pipeline development, application of state-of-the-art methods, and experimental collaboration constitutes the current, modern paradigm in drug development. Emerging and rapidly developing fields in drug discovery involve novel classes of small-molecule therapeutics that escape the boundaries of classical inhibition mechanisms. In this context, Proteolysis Targeting Chimeras (Protacs) are a promising drug modality of bifunctional compounds that promote the degradation of a protein of interest by triggering endogenous ubiquitin-proteasome signalling. In addition, protein-protein and protein-peptide interface design are methods central to state-of-the-art protein engineering campaigns. Extreme-throughput mutagenesis through computational modelling enables the vast exploration required for true novelty in fields such as inhibitor design, antibody-antigen recognition, and specificity and affinity engineering. This work presents novel tools and scientific findings in a research model that follows the state-of-the-art CSB paradigm of integrative development, application, and experimental collaboration.
We present a fully automated and accurate Protac ternary complex modelling platform; an extreme high-throughput interface mutagenesis tool able to scan tens of millions of mutants in a physics-informed manner; and an automated molecular dynamics pipeline that introduces a unique, notably gentle protocol for system equilibration. These tools were then applied to research questions involving prominent disease targets, to guide subsequent experimental efforts or to explain previous experimental findings. The work presented in this thesis also provides a framework for understanding CSB's ever-growing importance in current innovative scientific research.

Sustainability Practices and Financial Performance in Low-Cost Airlines: A Panel Analysis Using a Sustainability Disclosure Index (University of Waterloo, 2026-05-06). Goyal, Siddhant.

The aviation industry faces growing pressure to address its environmental and social impacts as climate regulations tighten and stakeholders, including investors, regulators, and civil society, demand greater transparency from firms operating in emissions-intensive sectors. Although sustainability disclosure has expanded significantly across industries, its financial implications remain contested, particularly in sectors where competitive advantage depends on cost efficiency. Low-cost carriers provide a distinct context for examining this relationship because their business models prioritize cost leadership, high aircraft utilization, and lean operational structures. In such settings, firms face institutional pressure to communicate sustainability commitments while operating under structural constraints that limit discretionary investment. Drawing on stakeholder theory, and specifically the salience framework developed by Mitchell et al.
(1997), this study conceptualizes sustainability disclosure as a selective organizational response to differentiated stakeholder demands, where the depth and structure of disclosure reflect which stakeholder pressures are most financially proximate and most compatible with the operating logic of the low-cost business model. Using sustainability and annual reports from 20 global low-cost carriers between 2015 and 2024, the study develops a Sustainability Disclosure Index (SDI) to measure the depth of Environmental, Social, and Governance (ESG) disclosure. The SDI is constructed through systematic content analysis of corporate reports, applying a structured coding framework aligned with Sustainability Accounting Standards Board (SASB) sustainability topics. The resulting dataset is used to examine whether sustainability disclosure depth is associated with financial performance through panel regression analysis. The findings indicate that while sustainability disclosure depth among low-cost carriers has increased steadily over the study period, greater overall disclosure depth is not associated with improved financial performance. Disaggregated analysis reveals that environmental disclosure is positively associated with return on assets, consistent with the interpretation that investor and regulatory demands converge most directly on the environmental dimension and that the underlying practices overlap with the operating economics of the low-cost model. Social disclosure is associated with a negative short-run relationship with profitability in the full sample, a pattern concentrated in financially distressed airline-years rather than representing a general feature of the LCC segment. Governance disclosure shows no statistically significant association with financial performance, consistent with its threshold-driven nature and limited cross-firm variation. 
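As an illustration of how a disclosure index of this kind is typically scored (the topic list and the 0–2 depth scale below are hypothetical; the thesis's actual coding framework follows SASB sustainability topics):

```python
def sdi(topic_scores, max_score=2):
    """Sustainability Disclosure Index for one firm-year.

    topic_scores: depth code per topic (e.g. 0 = no disclosure,
    1 = qualitative mention, 2 = quantitative disclosure).
    SDI = attained score / attainable score, so it lies in [0, 1].
    """
    if not all(0 <= s <= max_score for s in topic_scores.values()):
        raise ValueError("scores must lie within the coding scale")
    return sum(topic_scores.values()) / (max_score * len(topic_scores))

# Hypothetical airline-year coded on four illustrative topics
scores = {"ghg_emissions": 2, "fuel_efficiency": 2,
          "labour_relations": 1, "board_oversight": 0}
print(sdi(scores))  # 5 / 8 = 0.625
```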
The results suggest that the financial relevance of sustainability disclosure in the low-cost airline industry is conditional on which stakeholder demands the underlying practices address and how closely those demands are connected to the operating economics of the business model.

Probabilistic Verification of Quantum Devices Under Finite Measurement Resolution and Adversarial Disturbances (University of Waterloo, 2026-05-06). Rosas-Bustos, Jose.

The development of practical quantum devices is transitioning from laboratory-scale demonstrations to engineered systems intended for integration, deployment, and sustained operation. As quantum hardware increases in complexity and scale, establishing reliable verification procedures under realistic operating conditions becomes a central engineering challenge. Real quantum devices operate under finite measurement resolution, estimator tolerances, drift, hardware constraints, and limited data, all of which fundamentally restrict what can be inferred from experimental observations. Consequently, verification strategies based on static thresholds, ideal measurements, or full microscopic reconstruction are often insufficient for deployment-grade systems. This thesis develops an engineering-oriented probabilistic framework for modelling and verifying quantum devices under finite measurement resolution and adversarial disturbances. Rather than treating verification as a binary decision, the work reframes it as a system-level inference problem governed by uncertainty, tolerances, and acceptance criteria. Across five manuscript-based studies, the thesis identifies fundamental verification vulnerabilities, introduces probabilistic modelling approaches, and develops practical mitigation, scaling, and governance-oriented strategies compatible with real hardware.
First, the thesis shows that finite measurement resolution can create regions of operational indistinguishability in which conventional quantum integrity checks become statistically non-discriminating, even when the underlying theoretical assumptions remain valid. Building on this result, a probabilistic verification framework is introduced to model acceptance outcomes under uncertainty and to quantify confidence levels rather than rely on fixed thresholds alone. The thesis then develops an operational threat-modelling framework for adversarial disturbances in continuous-variable quantum communication, classifying structured interference into reconnaissance, exploratory, and denial-of-service regimes on the basis of receiver-observable statistics and finite-sample detectability. In response to such disturbances, phase-first modulation strategies are developed to show that static operating points can be inadequate under structured stress and that lightweight, hardware-compatible adaptations can improve resilience. To address scalability in large quantum systems, the Effective Mode Approximation is introduced as a reduced-order probabilistic verification framework for collective Hamiltonian behaviour, enabling system-level assessment without full mode-resolved reconstruction. Finally, a probabilistic forecasting framework is developed to model time-dependent cryptographic security degradation under evolving classical and quantum threat capabilities, extending verification concepts to strategic risk assessment and transition planning. Taken together, these contributions establish a unified probabilistic verification perspective grounded in robustness, scalability, and operational realism. 
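The shift from a fixed pass/fail threshold to a confidence statement can be illustrated with a simple Gaussian estimator model (purely illustrative; the thesis's acceptance criteria are richer than this single-parameter sketch):

```python
import math

def acceptance_probability(true_value, threshold, sigma):
    """P(estimate >= threshold) when the estimate is true_value + N(0, sigma^2).

    With finite measurement resolution (sigma > 0), a device sitting
    exactly at the threshold is accepted only half the time, so a fixed
    cut-off alone carries no confidence margin.
    """
    z = (threshold - true_value) / sigma
    # Gaussian upper-tail probability: 1 - Phi(z) = 0.5 * erfc(z / sqrt(2))
    return 0.5 * math.erfc(z / math.sqrt(2.0))

print(acceptance_probability(0.99, 0.99, 0.005))  # exactly at threshold -> 0.5
print(acceptance_probability(0.98, 0.99, 0.005))  # 2 sigma below -> ~0.023
```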
The proposed methods align with established principles from control, signal processing, and system identification, and they provide practical tools for assessing the trust, performance, and readiness of quantum hardware as it moves toward real-world deployment.

Towards Socket Testing Standardization: Advancing Mock Residual Limbs for Transtibial Prosthetic Socket Testing Through 6DOF Gait Simulation (University of Waterloo, 2026-05-06). Rossi, Erica.

The transtibial prosthetic socket is a key component of below-knee prostheses. While the technologies used to create and augment sockets have been evolving, testing methods have not kept pace. The criterion standard for the testing of transtibial prostheses, ISO 10328, does not specify testing for the socket as a stand-alone unit beyond static compression testing to simulate the heel-strike and toe-off components of gait. Additionally, the limb dummies, or Mock Residual Limbs (MRLs), used in these testing methods to interface with, and impart loading on, the socket are poorly defined and typically made of non-compliant plaster or polyurethane. This lack of standardization, specifically for socket testing, has been identified by the American Orthotics and Prosthetics Association Socket Guidance Workgroup as a major limitation and motivated this thesis. This research aimed to develop a mechanical cyclic gait testing setup utilizing a novel compliant MRL. Ten (10) MRL variations were developed, comprising an aluminum threaded rod to simulate the tibia and a single or dual layer of silicone to simulate the skin and soft tissues of the residuum. EcoFlex 00-30 and VytaFlex 30A silicones were selected as the MRL materials to replicate skin and tissue material properties and mechanical behaviour.
Three novel 3D-printed Spider structures were also developed (50 mm diameter, 70 mm diameter, and Equidistant) and integrated with the tibia rod to promote material adhesion between the rod and the silicone and to support load transfer by creating a mechanical interlock with the silicone. Affixed to the VIVO™ Joint Simulator, the MRLs were seated into a transtibial prosthetic socket for uniaxial and multiaxial mechanical testing. The VIVO™ enabled custom cyclic gait waveforms representing realistic amputee knee joint forces and moments. These loads were imparted on the socket and enabled the evaluation of MRL mechanical properties. Uniaxial and multiaxial cyclic gait testing of the MRL and socket system highlighted the ability of the Spider to increase the MRLs' response to loading. The 50 mm Spider improved the moment loading response, and the 70 mm Spider improved the force loading response, when integrated with the MRLs. Additionally, the increased stiffness of the single-layer VytaFlex 30A silicone (compared to the single-layer EcoFlex 00-30) allowed for an improved moment loading response of the MRL. The dual-layer (60:40) MRL of VytaFlex 30A and EcoFlex 00-30 further improved the MRL force loading response. The top two performing MRLs recommended for further research are the VytaFlex 30A core with EcoFlex 00-30 outer shell 50 mm Spider MRL, and the VytaFlex 30A 50 mm Spider MRL.
By leveraging dynamic mechanical testing, this research can act as a bridge between rudimentary bench testing and resource-intensive clinical trials, changing the way socket testing is viewed and promoting more efficient technology development.

Item type: Item , Load Variation Resilient and Average Efficiency Enhanced Power Amplifiers for 5G/6G Beamforming System(University of Waterloo, 2026-05-06) Yu, Hang
The deployment of Fifth Generation (5G) and Sixth Generation (6G) infrastructure relies heavily on high-frequency beamforming architectures to deliver high data rates and spectral efficiency. However, the physical realization of these systems faces critical challenges: the need for high circuit integration, energy efficiency under high Peak-to-Average Power Ratio (PAPR) signals, and robustness against the dynamic load variations inherent in large-scale arrays. This doctoral thesis addresses these requirements through three advanced integrated-circuit design objectives, progressing from theoretical derivations in load-variation resiliency (Voltage Standing Wave Ratio (VSWR) resiliency) to front-end architectural synthesis for Power Amplifiers (PAs). To improve circuit integration and performance for Time-Division Duplex (TDD) operation in beamforming systems, Chapter 3 focuses on the co-design of a Transmit/Receive (T/R) Front-End Module (FEM). Traditional FEMs suffer from insertion loss and area overhead due to additional Single-Pole Double-Throw (SPDT) switches. To resolve this, this work presents an architecture that integrates a Doherty Power Amplifier (DPA) in the transmit path, which also functions as a switchless T/R isolation network during receive operation. On the receiver side, an embedded switching network maximizes isolation and bandwidth while jointly optimizing the overall FEM performance and integration trade-offs.
A 39 GHz prototype was fabricated using the GlobalFoundries 45nm Silicon-On-Insulator (SOI) CMOS process and achieves a Transmit (TX) mode gain of 15 dB, a saturated output power of 20 dBm, and a Power-Added Efficiency (PAE) of 23%/15% at peak and 6-dB back-off, respectively. In Receive (RX) mode, it delivers 20 dB of gain, a 4.5-dB noise figure, and an input 1-dB compression power of -16.5 dBm while consuming 32 mW. Occupying a core area of just 0.5 x 0.75 mm^2, this architecture demonstrates a highly competitive efficiency-noise-integration trade-off, achieving state-of-the-art performance for high-frequency FEMs. While integration and performance improvements are critical for beamforming systems, the load variation induced by antenna mutual coupling in Large-Scale Antenna Arrays (LSAAs) presents another critical challenge. Chapter 4 proposes a dual-mode PA design, utilizing different gate biasing to reconfigure its operational state. The work features a 'VSWR resiliency mode' to maintain robust performance under high load mismatch, and an 'Output-Back-Off (OBO) efficiency enhancement mode' to maximize efficiency under minimal load variation. Central to this design is a novel combiner network synthesized to support two distinct operational regimes: it can emulate the characteristics of balanced architectures (symmetric drain currents) for load-variation resiliency, or Doherty load modulation (asymmetric drain currents) for efficient OBO operation. A 29 GHz prototype was fabricated using GlobalFoundries' 22nm Fully-Depleted SOI CMOS process to validate the concept. Under a 50-ohm load, the VSWR-resilient mode achieves 16.5-dB gain, 12.5-dBm output power, and 18%/7.5% PAE at peak/6-dB OBO. The OBO efficiency-enhanced mode delivers 13-dBm output power with 19%/12.5% PAE at peak/6-dB OBO. 
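As a quick sanity check on how such figures relate, the standard PAE definition, PAE = (Pout − Pin) / PDC, can be used to back-calculate the DC power implied by the quoted VSWR-resilient-mode numbers (16.5-dB gain, 12.5-dBm output, 18% peak PAE). This is only an illustrative estimate, not a figure from the thesis: it assumes the quoted gain still holds at the quoted output power, which is optimistic near saturation.

```python
def dbm_to_mw(p_dbm: float) -> float:
    """Convert power in dBm to milliwatts."""
    return 10 ** (p_dbm / 10)

p_out_dbm, gain_db, pae = 12.5, 16.5, 0.18   # quoted figures
p_out = dbm_to_mw(p_out_dbm)                  # ~17.8 mW
p_in = dbm_to_mw(p_out_dbm - gain_db)         # ~0.40 mW (assumes uncompressed gain)

# PAE = (Pout - Pin) / Pdc  =>  implied DC dissipation:
p_dc = (p_out - p_in) / pae
print(f"implied DC power: {p_dc:.1f} mW")     # ~96.6 mW
```

The exercise shows why back-off PAE matters for high-PAPR signals: at 6-dB OBO the output power drops by a factor of four while the DC dissipation falls much more slowly.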
Across a 2.5:1 VSWR load over a 360-degree phase range, the VSWR-resilient mode exhibits only 0.5-dB average saturated-power degradation, compared to 1 dB in the OBO efficiency-enhanced mode. Modulated measurements under varying VSWR loads further confirm the superior load-variation tolerance of the proposed architecture. This dual-mode approach offers flexibility between VSWR resiliency and efficiency enhancement; however, communication protocols often demand simultaneously efficient and robust operation. Consequently, Chapter 5 unifies these requirements by establishing the theory and design of a "VSWR-Resilient DPA," extending the analytical framework of Chapter 4 to ensure robust performance across multiple power regimes. The analysis yields architectures that maintain the OBO efficiency profile of a DPA while simultaneously delivering load-variation insensitivity against Multiple-Input Multiple-Output (MIMO) beamforming array mismatch. A prototype targeting 8 GHz is designed using a commercial MACOM GaN bare-die transistor on a multi-layer PCB substrate; however, due to procurement delays, experimental validation is deferred to future work. In EM-circuit co-simulation, the architecture achieves 10-dB small-signal gain, 45-dBm saturated output power, and 49%/35% PAE at peak/6-dB OBO under a 50-ohm load, while maintaining less than 1.5-dB saturated-power variation and 1.55× normalized Class-B efficiency at 6-dB OBO across different antenna loads on the 3:1 VSWR circle.

Item type: Item , Sustainable Strategies for Arctic Lifelines: Funding Decisions in the Face of Climate-Change Uncertainty(University of Waterloo, 2026-05-06) GHOLAMI, HAMED
Remote Arctic communities rely on ephemeral winter roads for the affordable delivery of essential goods.
As climate change destabilizes the physical foundation of these supply chains, policymakers face a complex stochastic allocation problem: how to optimally divide a constrained budget between supply-side infrastructure investments (to extend road duration) and demand-side consumer subsidies (to bolster household purchasing power). In this thesis, we develop a stylized stochastic optimization model to analyze this trade-off, formulating a capacity-budget gap parameter to capture the dual bottlenecks of physical throughput and financial liquidity. We prove that the optimal funding strategy, serving as a stochastic hedge, is strictly bounded between the supply-constrained and demand-constrained deterministic solutions. Through comparative statics, we uncover counter-intuitive operational trade-offs, including a price-affordability trade-off and a logistics-efficiency trade-off, in which improvements in supply chain economics rationally trigger infrastructure divestment due to an underlying income saturation effect. Furthermore, we analyze the compound effect of climate change: a secular decline in mean winter duration combined with rising interannual volatility. We demonstrate that for the most vulnerable communities, this compound shock pushes the system into a dilution state in which higher volatility counter-intuitively reduces the optimal investment level. Our findings suggest that as climate uncertainty accelerates and mean operational windows shrink, reliance on winter road infrastructure becomes economically unsustainable, necessitating a strategic policy pivot toward direct income support and alternative logistics.

Item type: Item , Feature Representation for Sea Ice Mapping(University of Waterloo, 2026-05-06) Noa Turnes, Javier
Sea ice monitoring is essential for climate research, Arctic navigation, and operational decision-making.
Synthetic aperture radar (SAR) imagery is the primary sensing modality used by national ice services because of its independence from atmospheric and lighting conditions and its sensitivity to surface structure. However, SAR-based sea ice classification remains challenging due to spatially non-stationary statistics caused by incidence-angle effects, seasonal transitions, and strong within-class variability. These factors complicate feature extraction and limit the robustness and transferability of conventional deep learning models. This thesis investigates feature representation learning for sea ice classification in SAR imagery through both supervised and self-supervised paradigms. The first contribution introduces a supervised semantic segmentation framework that integrates convolutional neural networks (CNNs), transformers, and unsupervised region segmentation. The proposed Irregular Tokens on Transformers (ITT) architecture forms multi-scale, homogeneous tokens using Iterative Region Growing on Semantics (IRGS) and applies self-attention to capture long-range spatial dependencies. A multi-task training scheme combines pixel-level and region-level loss functions, encouraging region-consistent feature representations while preserving fine-grained boundaries. Experiments on multi-season RADARSAT-2 scenes demonstrate improved overall accuracy, sharper boundary delineation, and reduced predictive uncertainty compared to a CNN baseline. An expert audit conducted by the Canadian Ice Service further supports the operational relevance and stability of the approach across freeze-up and melt conditions. While supervised learning delivers strong performance when annotations are available, SAR labeling remains costly and domain-specific. The second contribution explores self-supervised pre-training toward a SAR foundation model for sea ice classification.
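The region-token idea behind ITT (pool pixel features into one token per homogeneous region, then apply self-attention across tokens) can be sketched in a few lines. This is a simplified, hypothetical illustration: random features stand in for CNN outputs, a random label map stands in for an IRGS segmentation, and a single attention head replaces the full transformer:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, C = 8, 8, 4
feats = rng.normal(size=(H, W, C))         # stand-in for per-pixel CNN features
regions = rng.integers(0, 3, size=(H, W))  # stand-in for an IRGS region label map

# Mean-pool pixel features into one token per region.
n_regions = int(regions.max()) + 1
tokens = np.zeros((n_regions, C))
counts = np.zeros(n_regions)
np.add.at(tokens, regions.ravel(), feats.reshape(-1, C))
np.add.at(counts, regions.ravel(), 1)
tokens /= counts[:, None]

# Single-head self-attention over the region tokens.
scores = tokens @ tokens.T / np.sqrt(C)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
context = attn @ tokens                    # tokens enriched with long-range context

# Broadcast enriched tokens back to pixels for dense (per-pixel) prediction.
pixel_features = context[regions]          # shape (H, W, C)
```

Because attention runs over a handful of region tokens rather than all H×W pixels, long-range dependencies become cheap, while broadcasting the enriched tokens back through the label map keeps predictions region-consistent.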
By leveraging masked representation learning and multi-task objectives, the proposed framework learns transferable representations from unlabeled SAR imagery. The study evaluates whether large-scale pre-training alone is sufficient to address domain shifts across sensors and seasons, or whether task-specific adaptations remain necessary. Results show that self-supervised pre-training substantially improves downstream performance and generalization, but optimal accuracy is achieved when it is combined with structured fine-tuning aligned with sea ice semantics. Overall, this thesis demonstrates that robust sea ice classification fundamentally depends on how feature representations are learned, and it provides principled strategies for improving scalability, generalization, and operational viability in Arctic SAR applications.

Item type: Item , Leveraging Interactive Human–AI Collaboration Methods to Enhance Key Stages of Programming Workflows(University of Waterloo, 2026-05-06) Liu, Xuye
Beyond writing code, programmers routinely move through several complementary tasks as they develop, refine, and share their work. These workflows typically involve recurring stages: understanding and documenting code, checking correctness and debugging, improving efficiency and scalability, and sharing results with others. Each stage has its own challenges: documentation often becomes outdated or inconsistent with evolving code, debugging can be time-consuming and opaque, performance improvement requires balancing competing goals (e.g., speed, memory, and clarity), and communicating results usually demands extra manual effort. This thesis investigates how human–AI collaboration can support programmers across four key stages of the workflow. To address these challenges, I begin by studying the needs and practices of programmers to understand where current tools fall short.
Based on these insights, I design interactive systems that integrate with common tools such as computational notebooks and IDEs and operate on invariant components (code cells, execution outputs, text) so that results remain compatible with common practices. Across the four stages, these systems provide context-aware code understanding across multiple cells, purpose-driven documentation generated from code and its execution results for different communicative purposes, presentation slides generated from code and results, and real-time, multi-dimensional code evaluation and optimization support during development, with authors remaining in control to inspect, edit, and refine outputs throughout. I conduct user studies and case studies to evaluate system usability and to assess how these approaches improve programmers’ productivity, confidence, and ability to share their work.

Item type: Item , Habitat Restoration Strategies for Eastern Meadowlark (Sturnella magna, L.) in Ontario(University of Waterloo, 2026-05-06) Atherton, Claire
Eastern Meadowlark (Sturnella magna, L.) is an at-risk grassland bird in Ontario. S. magna is declining in part due to breeding habitat loss and declining habitat quality. Habitat restoration has been proposed as a recovery measure. Information on S. magna’s habitat preferences and on current S. magna restoration initiatives is lacking in Ontario. I studied microhabitat characteristics within restored tallgrass prairie sites in Norfolk County, Ontario. No S. magna were observed on the study sites. I compared microhabitat characteristics between potential nest attempt periods, between sites, and against time since disturbance. Visual obstruction was the only characteristic that differed between nest attempt periods (p < 0.05). All microhabitat characteristics differed between sites (p < 0.05). All characteristics except woody vegetation cover (dCor = -0.00038, p = 0.52) showed a correlation with time since disturbance.
The levels of significance for all tests were determined to be artefacts of the small sample size and do not necessarily reflect true trends. Comparisons to the literature suggest that percent grass cover may have been too low and percent total cover too high to support S. magna, but reported results differed and were too sparse for meaningful comparisons. Likely, not enough time has passed since restoration for the sites to become suitable for S. magna. I distributed an online questionnaire about past, current, and future restoration strategies in Ontario to 334 people knowledgeable of S. magna, tallgrass prairies, and/or grassland birds. Thirty-five responses were received. Projects have occurred across southern Ontario, with clusters near Windsor and in Northumberland County. Delayed hay harvesting was the most common management strategy in restored areas. About half of respondents indicated that post-restoration monitoring occurs at least some of the time. Sixty-three percent of respondents indicated that projects used interventions. Most respondents indicated that projects were lacking in sufficiency and effectiveness. Limiting factors included finances and maintenance, and strengths included having a broad focus, planning, and monitoring. Key targets for future projects were southern and eastern Ontario. Key targets for future research were a better understanding of S. magna’s habitat needs and life cycle, responses to restoration, and use of anthropogenic grasslands. This thesis provides an overview of the state of S. magna restoration in Ontario. It provides a roadmap for restoration ecologists and conservation biologists to use when managing habitat for grassland birds in Ontario.

Item type: Item , Surface Wave Propagation using Ray Optics with Applications to Hamilton Harbour(University of Waterloo, 2026-05-05) Bhavsar, Khush
Surface gravity waves are a commonly observed phenomenon in closed lakes.
They are primarily generated by winds, but may also be generated by the wakes of vessels or by tidal forces. These waves are known to undergo refraction and changes in energy (and therefore amplitude and speed) as they propagate over variable depths in a closed lake. In this thesis, we use the linearized ray optics equations to study the propagation of surface waves in closed lakes with variable bathymetry. We assume that the fluid is inviscid, irrotational, and incompressible. We further assume that the wavelength of these waves is much smaller than the length scales over which the bathymetry (depth) varies; in other words, we assume the waves propagate in a slowly varying environment. We perform a series of numerical simulations of surface wave propagation with various initial conditions and bathymetries to validate the model, as well as to gain insight into the effects of bathymetric roughness in diverting the expected trajectories of these waves. Finally, we present a case study of Hamilton Harbour. This watershed has been under scrutiny for several years due to rising eutrophication levels affecting dissolved oxygen levels in the basin. We apply the numerical model to the bathymetry of Hamilton Harbour to locate hotspots of wave accumulation in the lake, which may indicate regions where sediment resuspension is expected; such resuspension is one of the primary drivers of internal phosphorus loading in lakes.
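For reference, the ray optics (geometric optics, or WKB) framework for linear surface gravity waves over slowly varying depth is standard; a sketch, in generic notation that may differ from the thesis, is:

```latex
% Dispersion relation for linear surface gravity waves over local depth h(\mathbf{x}):
\omega^2 = g\,k \tanh\!\bigl(k\,h(\mathbf{x})\bigr), \qquad k = |\mathbf{k}|.

% Ray equations: a wave packet travels at the group velocity, while the
% slowly varying depth refracts the wavenumber vector:
\frac{d\mathbf{x}}{dt} = \frac{\partial \omega}{\partial \mathbf{k}} = \mathbf{c}_g,
\qquad
\frac{d\mathbf{k}}{dt} = -\frac{\partial \omega}{\partial \mathbf{x}}
 = -\frac{\partial \omega}{\partial h}\,\nabla h.

% For a steady medium, \omega is constant along each ray, so conservation of
% wave action A = E/\omega reduces to conservation of energy flux E\,c_g
% between adjacent rays, which yields the amplitude changes (shoaling and
% focusing) referred to above.
```

Where rays converge over shallowing bathymetry, the conserved energy flux is squeezed into a narrower ray tube, producing the wave-accumulation hotspots the case study searches for.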