I'm sure we've all been taught things that aren't true. Hell, I'm not sure if I should eat eggs or not. Their value in your diet seems to change every couple of years.
No one is suggesting a religious education. I'm suggesting that opening up that option by mentioning it wouldn't be the end of the world, as some seem to believe.
FWIW, I don't care one way or the other. I'm just curious why so many people do.