Hollywood is for-profit, is what Hollywood is. All the studios are owned by big megacorporations that are the furthest thing from liberal you can possibly imagine.
You could feel America starting to ease up a little bit on racism against blacks in certain pockets, and then suddenly The Cosby Show bubbled up and it was the right time for it.
In general, foreign companies that come to America to start a business, to open a manufacturing operation or whatnot, actually pay much higher wages than American companies do.
Hollywood has to appeal to the broadest audience, and when it comes to most social and economic issues, America is progressive. Because of that, the messages in Hollywood movies tend to be pro-environment, for instance.
I think American culture had just become so disengaged from the process of government, and we'd been so fuzzed out by the pop culture around us, that I don't think people really saw this guy for what he was.
Obamacare is a private mandate that will drive billions to the insurance industry, much like the auto insurance mandate. Hardly socialism. In fact, it was a Republican plan to begin with.