Abstract

Earthquake stress drops are inferred to be independent of source depth, contradicting the linear scaling predicted for earthquakes as frictional stick‐slip instabilities, which assumes fault normal stress increasing with overburden. Here, we examine the scaling between averaged stress drops and increasing normal stress for simulated earthquake sequences in continuum rate‐and‐state fault models. The models produce a weaker dependence of stress drop on normal stress than the linear dependence of simple friction, and this dependence is well fit by a sublinear power law. The effect is more prominent when the fault dimension is much larger than nucleation scales. In such cases, the averaged behavior of ruptures is dominated by rupture propagation conditions, reflecting more heterogeneous shear stress conditions. As natural faults can be considerably larger than the smallest earthquakes they host, such a weaker scaling between averaged rupture conditions and normal stress may partially explain the lack of an inferred depth‐dependence of earthquake stress drops.
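To make the sublinear power-law scaling concrete, here is a minimal sketch of how such a fit could be performed: a least-squares line in log-log space recovers the exponent of a relation stress_drop = A * sigma_n**p, where p < 1 indicates sublinear scaling. The numerical values (exponent 0.6, prefactor 1.5, the stress range) are illustrative assumptions, not results from the study.

```python
import numpy as np

# Hypothetical illustration (values are made up, not from the study):
# averaged stress drop assumed to follow stress_drop = A * sigma_n**p with p < 1.
sigma_n = np.array([10.0, 20.0, 40.0, 80.0, 160.0])  # effective normal stress, MPa
stress_drop = 1.5 * sigma_n**0.6                     # synthetic sublinear data, MPa

# Linear least squares in log-log space:
# log(stress_drop) = log(A) + p * log(sigma_n)
p, log_A = np.polyfit(np.log(sigma_n), np.log(stress_drop), 1)
print(f"fitted exponent p = {p:.2f} (p < 1 indicates sublinear scaling)")
```

Because the synthetic data here are noiseless, the fit recovers the assumed exponent exactly; with real stress-drop estimates, scatter in the data would make the inferred exponent uncertain.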