Ty Cobb's season batting average is the same as Shoeless Joe Jackson's at the beginning of a late season doubleheader.
Assume both players have had hundreds of at bats.
Cobb went 7 for 8 on the day (0.875) while Jackson's season average turned out to be higher than that of Cobb.
How is this possible?
Of course, one way to explain it would be to say Jackson went 8 for 8 that day. That would be no paradox.
But I assume what is intended is that Jackson had a non-exceptional day on the day in question, or at least one not as good as Cobb's showing.
The explanation would be that Cobb had played many more games that season, and so had many more at-bats, than Jackson. All Jackson needed to do was somewhat better than his season average, and those eight at-bats would raise his average by more than even a spectacular day raises Cobb's, since they carry far more weight against Jackson's smaller total.
Say previously Cobb had 300 at-bats with 100 hits, while Jackson had 30 at-bats with 10 hits. Each was batting .333.
After his 7-for-8 day, Cobb has 308 at-bats and 107 hits, batting .347 for the season. If Jackson went "only" 4 for 8 that day, he'd be batting 14/38 = .368. I believe this is an instance of Simpson's paradox.
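For anyone who wants to verify the arithmetic, here is a quick sketch in Python; the at-bat and hit counts are just the made-up figures from the example above, not the players' real statistics:

# Hypothetical pre-doubleheader totals from the worked example
cobb_ab, cobb_hits = 300, 100
jack_ab, jack_hits = 30, 10

cobb_ab, cobb_hits = cobb_ab + 8, cobb_hits + 7   # Cobb goes 7 for 8
jack_ab, jack_hits = jack_ab + 8, jack_hits + 4   # Jackson goes "only" 4 for 8

print(round(cobb_hits / cobb_ab, 3))   # 0.347
print(round(jack_hits / jack_ab, 3))   # 0.368 -- higher, despite the worse day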
Posted by Charlie on 2024-03-16 09:15:54