Study Says Hollywood Still Has ‘Permissive Climate’ of Sexual Harassment, Racism
Left-wing Hollywood elites like to lecture Americans on the importance of diversity and inclusion. But a new study has found that the entertainment industry is failing to live up to the very standards it seeks to impose on others.
