Executive Summary:

Semiconductor chips are crucial components of electronic systems ranging from smartphones and cloud servers to cars, critical infrastructure, and defense systems. With the rise of data-intensive applications such as Big Data, Artificial Intelligence, and the IoT, the semiconductor industry is seeking energy-efficient, compact transistors with long lifespans. Nanosheet Field Effect Transistors (FETs) are considered the next-generation technology for sub-3 nm technology nodes. However, their confined geometry exacerbates the self-heating effect, which can cause carrier mobility degradation, threshold voltage shift, negative output conductance, and other aging problems, ultimately reducing transistor lifespan.

The most commonly used method for analyzing self-heating-driven reliability issues is 3D TCAD simulation, but this technique is computationally intensive and time-consuming. The semiconductor device community has therefore begun to apply machine learning to reliability and variability problems, and machine-learning-assisted frameworks have been developed for point defect prediction, identification of device structural variation, and inverse design.

This proposal aims to create a new machine learning framework that accelerates the prediction of self-heating-driven failure and reliability issues in Nanosheet FETs while matching the accuracy of 3D TCAD simulation. The developed ML framework will undergo rigorous testing to optimize computational performance, accuracy, and robustness.
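To make the surrogate-modeling idea concrete, the sketch below is a minimal illustration, not the proposal's actual framework: it trains a gradient-boosted regressor to map hypothetical device and bias parameters (nanosheet width, thickness, gate length, drain bias) to a self-heating-induced threshold-voltage shift. Synthetic data stands in for 3D TCAD results, and all feature names and the placeholder formula are assumptions made for illustration only.

```python
# Minimal sketch (assumptions): a surrogate regressor learning the mapping from
# device/bias parameters to a self-heating-induced threshold-voltage shift (dVth).
# Synthetic data stands in for 3D TCAD results; features and the dVth formula
# are illustrative placeholders, not the proposal's actual model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_samples = 2000

# Hypothetical inputs: nanosheet width/thickness (nm), gate length (nm), drain bias (V).
X = np.column_stack([
    rng.uniform(10, 50, n_samples),    # nanosheet width
    rng.uniform(4, 8, n_samples),      # nanosheet thickness
    rng.uniform(12, 20, n_samples),    # gate length
    rng.uniform(0.4, 0.9, n_samples),  # drain bias
])

# Placeholder "ground truth": dVth (mV) grows with bias and shrinks with device volume,
# mimicking the trend that stronger confinement worsens self-heating.
y = (120.0 * X[:, 3] ** 2) / (X[:, 0] * X[:, 1] * X[:, 2] / 1000.0) \
    + rng.normal(0.0, 1.0, n_samples)  # simulation/measurement noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Once trained on TCAD-generated data, the surrogate evaluates a new device design in
# microseconds, versus hours for a full electro-thermal 3D TCAD run.
model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"MAE of predicted dVth shift: {mean_absolute_error(y_test, pred):.2f} mV")
```

In the proposed work, the training targets would instead come from calibrated 3D TCAD simulations, and the choice of model family would be determined during the rigorous testing and benchmarking described above.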