Image Source: www.imdb.com

I googled "controversial films," and all that came up were recently made documentaries about the government, the oil crisis, the war in Iraq, the American diet, etc. I'm not sure yet how I feel about films that cover such controversial topics. These films are obviously independent, because not a single Hollywood studio would touch these projects with a ten-foot pole.

While I do love to watch documentaries and go "behind the scenes," I almost feel that these producers and directors are going a little too far. The world already knows that America's government and economy are in bad shape, we know that 92% of Americans are too fat, and we know that "the earth is going to melt soon from the oil crisis and everything that comes with it." Do we really need more media out there shoving all of this garbage down our throats? I do keep up with these issues, and yes, I definitely have my own opinions on them. However, I don't really care to watch a movie about someone else's opinion on these touchy, yet boring, subjects, whether I agree with it or not.

Also, this is probably going to sound somewhat weird, but in ten years, granted the earth is still here, I don't really want my son or daughter to google 'earth' for a science project and have a YouTube video pop up of Al Gore saying that we'll all be melting soon.

I guess by the end of this post I have formed somewhat of an opinion, and it is this: documentary films are interesting and educational, but the current generation of doc filmmakers is taking things a little too far.


