Ok, here's a quick 3-minute Excel analysis of the impact of team stats on run scoring, done without writing any special macros. Unfortunately LINEST doesn't output standardized weights for multivariate analyses, but it does give the raw coefficient for each variable along with the associated standard error. Here are the numbers for the steroid era, which I've defined as 1994-2005:
Stat: AVG | OBP | SLG | K%
Coefficient: 498.9127 | 1936.828 | 2815.681 | -412.566
Std. error: 289.6629 | 239.1281 | 510.2548 | 741.7796
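For anyone who wants to reproduce this outside Excel, here's a rough sketch of the same fit in Python/NumPy. The data below is made up (random stand-ins for the four stats, with true coefficients loosely in the ballpark of the table above); plug in real team-season rows to get the actual numbers.

```python
import numpy as np

# Hypothetical team-season data; a real run would use actual
# AVG/OBP/SLG/K% columns and runs scored as the response.
rng = np.random.default_rng(0)
n = 360                      # e.g. 30 teams x 12 seasons
X = rng.normal(size=(n, 4))  # stand-ins for AVG, OBP, SLG, K%
true_beta = np.array([500.0, 1900.0, 2800.0, -400.0])
y = X @ true_beta + rng.normal(scale=50.0, size=n)

# OLS with an intercept, the same fit LINEST performs
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Standard errors: residual variance times diagonal of (A'A)^-1
resid = y - A @ coef
dof = n - A.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(A.T @ A)))
```

`coef[1:]` holds the four slope estimates and `se[1:]` their standard errors, matching the two rows LINEST returns.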
You'll note that for K rate, the error associated with the impact on run scoring is considerably larger than the coefficient itself. Thus, 0 impact on run scoring is firmly within the confidence interval, as is a range of positive values. In other words, K rate has basically no impact on run scoring independent of other things it might influence, like AVG and OBP.
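The confidence-interval reasoning is easy to check directly from the posted numbers, using the usual rough rule of coefficient plus or minus two standard errors for a ~95% interval:

```python
# Steroid-era K% numbers from the table: coefficient and standard error
coef_k, se_k = -412.566, 741.7796
lo, hi = coef_k - 2 * se_k, coef_k + 2 * se_k   # rough 95% interval
# lo is about -1896, hi is about +1071
zero_inside = lo <= 0 <= hi                     # True: zero effect can't be ruled out
```

The interval spans from roughly -1896 to +1071 runs per unit of K%, so both zero and substantial positive values are consistent with the data.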
Here are the numbers for 2007-present:
Stat: AVG | OBP | SLG | K%
Coefficient: -153.153 | 1836.141 | 2540.96 | -480.846
Std. error: 93.44375 | 96.98936 | 212.8566 | 271.6744
Now the K rate is decidedly a negative factor: the coefficient is nearly twice its standard error. It's still extremely small relative to OBP and SLG, but it's clearly contributing an additional negative value in and of itself. So in the post-steroid era, Ks do matter, just not a lot.
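One simple way to compare the two eras is the t-ratio, the coefficient divided by its standard error, computed here straight from the posted LINEST output:

```python
# t-ratios (coefficient / standard error) from the two posted LINEST runs
t_steroid = -412.566 / 741.7796   # roughly -0.56: nowhere near significant
t_recent  = -480.846 / 271.6744   # roughly -1.77: a much stronger negative signal
```

The K% signal goes from about half a standard error in the steroid era to nearly two in the later sample, which is the whole story in two numbers.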
During the steroid era, they didn't matter. That's a statistical fact: teams that struck out more did not score fewer runs. There's no way to argue around it; it's just the way it was. You can outline any set of scenarios you want, but it's inherently foolhardy to argue with empiricism. You can't win that argument.