RedRaiderLaw, no one personally attacked you or called you a name. Two people offered the opinion that your conclusions were baseless, not that you personally were stupid.
My understanding of hitter fatigue works like this: if you play a .300 hitter at 80%, he's going to become roughly a .240 hitter, and you'll see a similar impact on home run rate, walk rate, etc. That difference takes a long time to reveal itself, since it's only six hits over 100 at bats with all else equal. There's the famous Bull Durham monologue about how one extra hit a week over the course of a season is the difference between a .250 hitter and a .300 hitter.
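To put numbers on that, here's a quick back-of-the-envelope sketch. The flat 80% multiplier is my guess at how the mechanic works, not a documented formula:

```python
# Guessed mechanic: playing a hitter at 80% energy applies a flat
# 0.8 multiplier to his batting average.
full_strength_avg = 0.300
fatigue_factor = 0.80

fatigued_avg = full_strength_avg * fatigue_factor  # 0.240

# Over 100 at bats, the expected gap is only a handful of hits,
# which is why the difference takes so long to show up.
at_bats = 100
hit_gap = (full_strength_avg - fatigued_avg) * at_bats  # 6 hits

print(f"Fatigued AVG: {fatigued_avg:.3f}")
print(f"Expected hit gap over {at_bats} AB: {hit_gap:.0f}")
```

Six hits in 100 at bats is well inside normal small-sample noise, which is the point of the next example.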
It wouldn't be that strange an occurrence for a .240 hitter to outperform a .300 hitter over 100 at bats. Just looking at a team I have now, through 34 games, .390 RL AVG 1897 Fred Clarke is hitting .210, while .371 RL AVG 1885 Roger Connor is hitting .376.
As others have said though, the big problem with playing hitters fatigued is fielding. A similar percentage multiplier is applied to fielding percentage, except instead of a six-point drop from .300 to .240, you're looking at a nearly 20-point drop from, say, .980 to .784. Range is impacted similarly.
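Assuming the same guessed 80% multiplier, here's why the fielding hit is so much worse in absolute terms: the baseline is close to 1.000, so the same proportional drop costs far more points:

```python
# Same assumed 0.8 fatigue multiplier applied to both stats.
fatigue_factor = 0.80

# Absolute (percentage-point) drops:
batting_drop = 0.300 - 0.300 * fatigue_factor   # 0.060 -> 6 points
fielding_drop = 0.980 - 0.980 * fatigue_factor  # 0.196 -> ~20 points

print(f"Batting average drop:    {batting_drop:.3f}")
print(f"Fielding percentage drop: {fielding_drop:.3f}")
```

A .784 fielding percentage means booting roughly one ball in five, which shows up in the box score far faster than six missing hits per 100 at bats.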
I could be wrong about the exact multipliers at play; in fact, I suspect it's more complicated than a simple factor, since players at 0% do still occasionally get hits. But I am quite confident in saying that hitter fatigue does not mean nothing. If you believe it does, I'd suggest entering a team with 4000 PAs and no AAA backup and letting us know how it goes.