<p dir="ltr">Modern scientific and societal challenges increasingly demand statistical machine learning methods that are trustworthy. This dissertation contributes to the development of trustworthy AI by presenting three complementary methods that address key limitations in high-dimensional inference, dynamic modeling, and cost-sensitive prediction.</p><p dir="ltr">The first contribution introduces Neural Amortized Bayesian Conformal (NABC) Inference, an exact likelihood-free method designed to overcome the curse of dimensionality in traditional Approximate Bayesian Computation (ABC). By combining neural networks with ABC, NABC achieves substantial improvements in both computational and sample efficiency while preserving theoretical validity. The second contribution, Neural Conformal Inference for Jump Diffusion Processes (NCoin-JDP), reconceptualizes Bayesian inference for discontinuous stochastic processes. Leveraging a predictive reformulation together with conformal methods, NCoin-JDP performs uncertainty-aware inference without relying on likelihood evaluations or discretization schemes, offering a robust alternative to traditional MCMC techniques.</p><p dir="ltr">The third contribution develops a fast Cost-Constrained Regression framework for high-dimensional settings in which covariate acquisition is limited by a budget. It formulates prediction under cost constraints as a nonconvex optimization problem and delivers accurate, interpretable models with practical resource awareness.</p><p dir="ltr">Together, these methods advance the state of the art in simulation-based inference, inference for jump diffusion processes, and budget-aware prediction, paving the way for reliable, efficient, and interpretable solutions in real-world decision-making.</p>
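The curse of dimensionality that motivates the first contribution can be illustrated with a generic rejection-ABC toy example. This is a minimal sketch on a hypothetical Gaussian location model, not the NABC method itself: as the parameter dimension grows, the fraction of simulated datasets falling within a fixed tolerance of the observed data collapses, which is the sample-inefficiency that the neural amortization is designed to address.

```python
# Generic rejection-ABC sketch (illustrative only; not the NABC method).
# Model assumption (hypothetical): theta has a uniform prior on [-3, 3]^d,
# and data are simulated as theta + standard Gaussian noise.
import math
import random


def rejection_abc(observed, n_proposals=2000, tol=1.0, seed=0):
    """Accept prior draws whose simulated data land within `tol` of `observed`."""
    rng = random.Random(seed)
    d = len(observed)
    accepted = []
    for _ in range(n_proposals):
        theta = [rng.uniform(-3.0, 3.0) for _ in range(d)]   # draw from prior
        sim = [t + rng.gauss(0.0, 1.0) for t in theta]       # simulate data
        dist = math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, observed)))
        if dist < tol:
            accepted.append(theta)
    return accepted


# Acceptance rate in 1 dimension vs. 10 dimensions, same tolerance.
rate_low = len(rejection_abc([0.0])) / 2000
rate_high = len(rejection_abc([0.0] * 10)) / 2000
```

With the same tolerance, `rate_low` is a usable acceptance rate while `rate_high` is essentially zero, so naive ABC would need astronomically many simulations in higher dimensions.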