Hollywood is for-profit, is what Hollywood is. All the studios are owned by big megacorporations that are the furthest thing from liberal you can possibly imagine.
The only reason Hollywood ever skews liberal is that part of what we make out of Hollywood involves writers, actors, directors, musicians, set designers, and photographers. In general, people like that are going to be more progressive, more open-minded, a little more altruistic.
I have certain beliefs about how people should treat employees and how companies should be run, but I was really surprised through this process to learn that those beliefs are actually good business.
Hollywood has to appeal to the broadest audience, and when it comes to most social and economic issues, America is progressive. Because of that, the messages in Hollywood movies tend to be, for instance, pro-environment.
The easiest time to be funny is during a fairly serious situation. That way, you can break the ice. It's crazy, but even at funerals, people will get huge laughs.