Black Panther

Photo: Buena Vista
It does feel like there is a reckoning happening in Hollywood. Yes, there are the #MeToo and #TimesUp movements making an emotional impact, but beyond those, there finally seems to be a move toward more inclusivity. Diversity in Hollywood has always been a challenge, but the tone-deafness reached a nadir in 2016, when every single one of the Academy Award acting nominees was white, prompting the #OscarsSoWhite fury. Since then, the Academy has extended invitations to thousands of new members, most of them women and people of color. Beyond that, the widespread cultural demand for more diversity has been reflected in the movies being made, and in how they are being made. Hollywood could no longer continue down the path of being for and about white men. Don't get me wrong, white men still rule in Hollywood, both in front of and behind the camera (and where it really counts—in the executive offices), but the times, they are a-changin'.

Consider these facts:

• The top three domestic box office moneymakers of 2017 all featured women in the lead roles (Star Wars: The Last Jedi, Beauty and the Beast, and Wonder Woman).

• The third-highest-grossing movie of 2017 (domestically), Wonder Woman, was directed by a woman (Patty Jenkins).
