Abstract

We introduce a versatile generative learning framework that integrates probabilistic diffusion models, observational data, and domain knowledge for stochastic modeling of flow in porous media. The framework begins by pretraining an unconditional diffusion model to approximate the joint distribution of subsurface parameters and state variables, effectively capturing prior information about the dynamical system. By leveraging Bayesian conditional sampling, the model flexibly incorporates task-specific constraints and adapts to multiple modeling tasks without retraining or fine-tuning. Furthermore, we devise a training-free knowledge alignment strategy that embeds domain-specific knowledge into the sampling process, yielding spatiotemporal fields that are more consistent with physical principles. Extensive evaluations on diverse subsurface flow problems demonstrate that a single pretrained diffusion model, equipped with optimized generative paths, delivers superior performance in unconditional generation, forward prediction, uncertainty quantification, and inverse modeling with sparse and noisy data. These findings underscore the potential of knowledge-aligned generative learning to advance subsurface flow modeling research.
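The Bayesian conditional sampling idea mentioned above can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: it replaces the pretrained score network with the analytic score of a toy Gaussian prior, and uses plain Langevin dynamics rather than a full reverse-diffusion sampler, but it shows the key mechanism of steering an unconditional prior toward noisy observations by adding a likelihood-gradient guidance term at each sampling step. All names (`score_prior`, `score_likelihood`, the observation values) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Gaussian "prior" standing in for a pretrained unconditional model,
# and a single sparse, noisy observation y = x + noise.
mu_prior, sigma_prior = 2.0, 1.0
y_obs, sigma_obs = 3.5, 0.2

def score_prior(x):
    # grad_x log p(x) for N(mu_prior, sigma_prior^2); in the framework this
    # gradient would come from the pretrained diffusion model.
    return -(x - mu_prior) / sigma_prior**2

def score_likelihood(x):
    # grad_x log p(y | x) for y = x + N(0, sigma_obs^2); this is the
    # data-consistency guidance term injected at sampling time.
    return (y_obs - x) / sigma_obs**2

# Unadjusted Langevin dynamics on the posterior score:
# x_{k+1} = x_k + (eps/2) * (score_prior + score_likelihood) + sqrt(eps) * z
eps, n_steps, n_chains = 1e-3, 5000, 2000
x = rng.normal(size=n_chains)
for _ in range(n_steps):
    grad = score_prior(x) + score_likelihood(x)
    x = x + 0.5 * eps * grad + np.sqrt(eps) * rng.normal(size=n_chains)

# Closed-form Gaussian posterior for comparison.
post_var = 1.0 / (1.0 / sigma_prior**2 + 1.0 / sigma_obs**2)
post_mean = post_var * (mu_prior / sigma_prior**2 + y_obs / sigma_obs**2)
print(x.mean(), post_mean)  # sample mean approaches the analytic posterior mean
```

Because the guidance enters only at sampling time, the same "prior" serves forward prediction, inverse modeling, and uncertainty quantification by swapping the likelihood term, which mirrors the retraining-free flexibility claimed in the abstract.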