
Hollywood has always been political; its stars and studios consider it their right and duty to tell us what is politically good and right.
