Kate Winslet: Hollywood is working toward making women feel safe
In the wake of the #MeToo movement and the sentencing of Harvey Weinstein, Kate Winslet believes Hollywood is creating a space for younger women to feel "safe" at work that did not exist before.