The only reason Hollywood ever skews liberal is that part of what we make in Hollywood involves writers, actors, directors, musicians, set designers, and photographers. In general, people like that are going to be more progressive, more open-minded, a little more altruistic.
I have certain beliefs about how people should treat employees and how companies should be run, but I was really surprised through this process to learn that those beliefs are actually good business.
You could feel America starting to ease up a little bit on racism against blacks in certain pockets, and then suddenly The Cosby Show bubbled up, and it was the right time for it.
The crush of lobbyists on Washington and the purchase of the media by corporations have created a big-business-run government and a worthless press, leaving Americans screwed and ill-informed.
Hollywood has to appeal to the broadest audience, and when it comes to most social and economic issues, America is progressive. Because of that, the messages in Hollywood movies tend to be, for instance, pro-environment.