Hollywood is for-profit, is what Hollywood is. All the studios are owned by big megacorporations that are the furthest thing from liberal you can possibly imagine.
You could feel America starting to ease up a little bit on racism against blacks in certain pockets, and then suddenly The Cosby Show bubbled up and it was the right time for it.
I have certain beliefs about how people should treat employees and how companies should be run, but I was really surprised through this process to learn that those beliefs are actually good business.
Obamacare is a private mandate that will drive billions to the insurance industry, much like the auto insurance mandate. Hardly socialism. In fact, it was a Republican plan to begin with.
In general, foreign-invested companies that come to America to start a company or open a manufacturing business or whatnot actually provide much higher wages than American companies.
The only reason Hollywood ever skews liberal is that part of what we make in Hollywood involves writers, actors, directors, musicians, set designers, and photographers. In general, people like that are going to be more progressive, more open-minded, a little more altruistic.
Everyone is sort of in their own little area counting lines, and no one talks when film's not rolling. Actors are constantly coming to me back behind the monitor, screaming at me, "Why did my line count drop?" It's a nasty, tense environment.