The media's job used to be to report the news responsibly and factually, as it happened, after it happened. Now they seem to feel their job is to create or influence the news, and to pass off their (often manufactured for the occasion) opinions as the "facts." I don't really understand how that's better for business, other than that the more ridiculous a claim is, the more people are inclined to read it in today's distorted, mindless society. It's simply sad that so many people take remarks designed to be outrageous enough to generate views and clicks as true and factual. The dumbing-down of America by the media truly scares me for the future of this once great (and still better than most) country.