How about the good things Hollywood does? I think there are good things. First, some of the movies and documentaries we see offer a different perspective. Some movies make us think about things we might otherwise never have known about. Movies about war show us how things are, or were, for soldiers and the others involved. They might not be entirely accurate, but Saving Private Ryan definitely gave me a sense of what it was like for soldiers in WWII, and I couldn't have gotten that without seeing the movie. Even Juno, which was up for an Academy Award this year, offers a different perspective. It's about a pregnant teenager, a touchy subject in our culture, but the movie gives a different slant on the teen pregnancy situation. It isn't glorified, yet it isn't vilified either (you'll have to see it).
Even if you don't buy the whole "global warming" thing (and right now, in Minnesota, it's hard to jump on board with that), Al Gore's documentary, An Inconvenient Truth, raises some interesting and important issues. At the very least, it should make people think about what we're doing to the environment, and maybe even get us to care more about it, which is the right thing to do.
Movies, like books, let you glimpse how others live, and I think that's a good thing.