Monday, February 25, 2008

Post-Oscar Discussion

So, it's the day after the Academy Awards and I did pretty well in my predictions. I got four out of five. The only one I missed was Best Actress, which went to Marion Cotillard from France. This was an upset - none of the predictions I read had Marion winning Best Actress. What's interesting is that all four of the top acting honors went to non-Americans. I wonder if that has ever happened before.

Anyway, to tie this whole Oscar thing into an issue, let's take a look at how the Oscars, or Hollywood in general, influence the public. Does art imitate life, or does life imitate art? I've read some interesting articles arguing that Hollywood elites hold more radical social views than the general public, and that the stories they present shape the way we all view the world. Hollywood has the ability to disseminate information around the world, and this can leave lasting impressions on society and public opinion. There was an article on the National Public Radio website (http://www.npr.org/templates/story/story.php?storyId=6625002) that said, "Hollywood is an entertainment-industry juggernaut; its success in exporting movies, TV shows and music that have vast, global appeal is unparalleled. At the same time, anti-American sentiment is rising overseas, most notably in the Middle East, Latin America and Europe. A panel of authors, educators, and filmmakers gathered on December 13 [2006] to explore to what extent, if at all, these two phenomena are connected. They debated the proposition, 'Hollywood has fueled anti-Americanism abroad.'"

A couple of articles also stated that this was not always the case. It was when movie studio heads were removed as the final decision-makers over which movies got released (in the 1960s, I think) that movies started to portray violence, sex, drugs, and anti-Americanism. NPR stated that "in the past, movies provided a positive American image to the world." Their example was John Wayne movies, where the good guys had integrity and always beat the bad guys.

So, is it good that Hollywood brings to light issues the public might otherwise not be aware of? Or is this a bad thing? And if Hollywood does portray these issues, is that anti-American? Does it oversimplify, glorify, or exaggerate American life?

More on this issue later.
