Hollywood isn't in touch with our morals or our religion!
The majority of Americans (according to the Anti-Defamation League) think Hollywood is out of touch with the "real America". They feel Hollywood's morals are out of sync with the rest of the country, and they think the media and Hollywood are bringing religion down (the survey doesn't specify whether Americans mean Christianity in particular or religion in general, as in "Hollywood hates all religions in America").
What does everyone else think?