Data-driven surrogate models offer fast, inexpensive approximations to complex numerical and experimental systems but typically lack uncertainty quantification, limiting their reliability in safety-critical applications. While Bayesian methods provide uncertainty estimates, they offer no statistical guarantees and struggle with high-dimensional spatio-temporal problems due to computational costs and dependence on prior specification. We present a conformal prediction (CP) framework that provides statistically guaranteed marginal coverage for surrogate models in a model-agnostic manner with near-zero computational cost. Our approach handles high-dimensional spatio-temporal outputs by performing cell-wise calibration while preserving the tensorial structure of predictions. Through extensive empirical evaluation across diverse applications—including partial differential equations, magnetohydrodynamics, weather forecasting, and fusion diagnostics—we demonstrate that CP achieves the target empirical coverage with valid error bars regardless of model architecture (MLP, U-Net, FNO, ViT, GNN), training regime, or output dimensionality (spanning 32 to over 20 million dimensions). We evaluate three nonconformity scores (conformalised quantile regression, absolute error residual, and standard deviation) for both deterministic and probabilistic models, showing that guaranteed coverage holds even for out-of-distribution predictions where models are deployed on physics regimes different from their training data. Calibration requires only seconds to minutes on standard hardware, with prediction set construction incurring negligible computational overhead. The framework enables rigorous validation of pre-trained surrogate models for downstream applications without retraining, providing actionable uncertainty quantification for decision-making in scientific domains.
While CP provides marginal rather than conditional coverage and assumes exchangeability between calibration and test data—limitations we demonstrate empirically through sensitivity analyses—our method circumvents the curse of dimensionality inherent in traditional uncertainty quantification approaches, offering a practical tool for the trustworthy deployment of machine learning in the physical sciences.
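The cell-wise calibration described above can be sketched in a few lines of NumPy. This is a minimal illustration of split conformal prediction with the absolute-error-residual score, not the paper's implementation: the function names and the example field size are assumptions, but the finite-sample quantile correction and the per-cell treatment of the output tensor follow the standard split-CP recipe.

```python
import numpy as np

def calibrate_cellwise(preds, targets, alpha=0.1):
    """Cell-wise split-conformal calibration with the absolute error
    residual (AER) nonconformity score.

    `preds` and `targets` have shape (n_calibration, *output_dims);
    a separate quantile is computed for every output cell, so the
    tensorial structure of the prediction is preserved."""
    n = preds.shape[0]
    scores = np.abs(preds - targets)  # AER score, one per cell
    # Finite-sample-corrected quantile level for marginal coverage
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q_level, axis=0)  # shape: output_dims

def prediction_interval(pred, qhat):
    """Per-cell prediction set [pred - qhat, pred + qhat]."""
    return pred - qhat, pred + qhat

# Illustrative usage: 100 calibration points on a 32x32 spatial field
rng = np.random.default_rng(0)
targets = rng.normal(size=(100, 32, 32))
preds = targets + 0.1 * rng.normal(size=(100, 32, 32))
qhat = calibrate_cellwise(preds, targets, alpha=0.1)
lo, hi = prediction_interval(preds[0], qhat)
```

Because calibration reduces to one quantile computation per output cell, its cost is a single pass over the calibration set, which is consistent with the seconds-to-minutes calibration times reported above.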