Posted by duece_duece on 4/29/2014 6:35:00 AM:
This is very interesting to me. I've played this game for a couple of years now and never once informed a recruit that I had intentions of redshirting him. I've always gone by the theory that if he's the lowest-rated player at his position on my roster, he'll take the RS with no issues, and that has always seemed to work. If I know ahead of time that, say, I'm recruiting two PFs and he is the lower-rated of the two, I know he will take the RS. The way I see it, he believes he won't see much playing time because he's the "worst" player at his assigned position, so the RS is accepted. Note that I've mainly played DIII, so at the higher levels it may not work this way, idk.
Also, on a different note: is Gil's comment about 3 SVs being worth more than 1 HV true? I've never read that anywhere. I believe the general consensus is that a CV is equal to 2.5 HVs, so by my math that would mean about 8 SVs are worth more, in "recruiting value," than a CV?
that's accurate, 8 SVs > 1 CV
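The exchange above can be sanity-checked with a couple of lines of arithmetic. This is a sketch using the thread's own (unofficial) rules of thumb, not anything from the game's documentation: 3 SVs are worth a bit more than 1 HV, and 1 CV is worth about 2.5 HVs.

```python
# Treating 1 HV as exactly 3 SVs slightly OVERvalues the HV
# (the thread says 3 SVs > 1 HV), so this check is conservative.
HV_IN_SVS = 3.0               # 1 HV <= 3 SVs (thread's rule of thumb)
CV_IN_SVS = 2.5 * HV_IN_SVS   # 1 CV ~ 2.5 HVs ~ 7.5 SVs

print(CV_IN_SVS)       # 7.5
print(8 > CV_IN_SVS)   # True: 8 SVs edge out 1 CV
```

Since 2.5 × 3 = 7.5, eight SVs clear the bar even under the conservative conversion, which matches the "8 SVs > 1 CV" reply.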
On the other hand, when I played D2/D3, I redshirted religiously - but the goal was never to take the free redshirt (although it's a great fallback plan). You get about 40% more out of a redshirted player for 25% more time (five seasons instead of four), and that's a pretty big benefit. A decent rule of thumb is to redshirt your best player. A better rule of thumb is to redshirt your most impactful player. Generally that's the leading scorer, but it can also be a traditional PG or occasionally a special big man. It's hard to guess, considering you have no idea who you'll recruit, but you do know who your freshmen are, and if you could keep one to build a team around, usually you can pick him out. That's the guy you try to redshirt via the inform-of-redshirt.
Going back to that 40% vs. 25%: that's a meaningful difference, but how much it matters varies WILDLY by player. Let me give an example. Say the average player you recruit gives you 10 units of contribution to your team over his career (the number is arbitrary, but it works out the same with any number). That's 2.5 per year over four years. If you redshirt him, that's 10 × 1.4 = 14 units over 5 years, or 2.8 per year. Decently better, but not huge - a 0.3 increase over 2.5 is a little better than a 10% improvement in his average contribution.
Using your approach, let's redshirt a guy who is probably below average, since he's the lowest-rated - say he's going to contribute 8 units over his career. Now instead of 2.0 per year, he's contributing 11.2/5, or about 2.24 a year. That's a smaller benefit - roughly a 0.24 increase instead of 0.3 per year. (Keep in mind your team is the sum of the players' contributions, so it's the absolute increase in contribution that really matters, not the relative, i.e. percentage, increase.)
Using the approach of redshirting an all-star, who can easily contribute 50% more than an average player (especially in a leading-scorer role across multiple seasons) - that's 15 units of contribution, or 3.75 per season. If you redshirt him, you get 15 × 1.4 = 21 units total, or 4.2 per season. That 0.45 gain is 1.5 times what you get from redshirting the average player, and almost double what you get from redshirting the worst player. That's a big difference! Of course, my 40% figure can be questioned, but I think it's pretty reasonable.
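The three scenarios can be sketched in a few lines. This applies the post's 40% redshirt bonus consistently to every player tier; the "units of contribution" scale is the post's own invented measure, and the 8/10/15-unit tiers are its example numbers.

```python
def per_year(total_units, redshirted):
    """Average contribution per season of a 4-year career,
    or a 5-year career with 40% more total output if redshirted."""
    if redshirted:
        return total_units * 1.4 / 5  # 40% more output over 5 seasons
    return total_units / 4            # normal 4-season career

for label, units in [("worst", 8), ("average", 10), ("star", 15)]:
    base = per_year(units, False)
    rs = per_year(units, True)
    print(f"{label}: {base:.2f} -> {rs:.2f} per year (gain {rs - base:.2f})")
```

Running this prints gains of 0.24, 0.30, and 0.45 units per year for the worst, average, and star player respectively - the better the player, the bigger the absolute payoff from the same redshirt.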
What's kind of missing here is a baseline: redshirting the good player is worth a lot more than redshirting the worst player - but compared to what? Not redshirting anyone? I had included that earlier but kind of confused myself, so I took it out; I'm pretty sure I included it wrong anyway, so maybe the logic here is a little flawed, but if you only consider the set of situations where you redshirt someone, I think it holds up. In general, a slightly worse player who gets redshirted could be replaced by an average player, which is more or less a wash anyway, so I think the gap is even larger if you look at it that way - but I'm not sure exactly how that would sort itself out. I'd have to at least drop this paragraph format and use a good old-fashioned pen and paper. Maybe later...
4/29/2014 1:41 PM (edited)