I debated whether to put this here or in Pop Culture - but I see this as more of a social woe. (Mods, if you feel otherwise, just do that voodoo that you do so well.)
Anyway, this was prompted by a post in the Politics forum that mentioned Hollywood's depiction of guns and violence. Now, for this argument, I'm going to use the generic term 'artist', and by 'artist' I mean those involved in theatre, film, TV, music, visual arts, dance, whatever.
My overriding theory about art (theatre in particular) has been that art is a "mirror to society". The art of a particular place and time reflects the problems, issues, morals, etc. of that place and time. So, if movies are really violent, that's because the real world is violent, and the movies are just reflecting that.
Is it really fair to the artist to expect them to censor themselves and only produce happy, light, pretty entertainment in the face of this modern world? Because, frankly, there's plenty of fluff anywhere you want to look. Sitcoms, romantic comedies, the glittery crap up on Broadway these days.
I don't think you can blame society's problems on TV or film or anything else - EXCEPT the overall culture of that time and place. In overly violent areas, there are much bigger forces at work than rap music and Reservoir Dogs.
Several years ago, I got into an argument with a co-worker because she thought that all TV/film/art should be censored because of the kids. She said, "What am I supposed to do if they see two boys kiss on TV? rantrantrant!" My response, "Um, YOU'RE the parent, that's up to you, not up to me or anyone else."
This is kind of long and rambly, but what are your thoughts? Is Hollywood to blame?