Do you think the stream of sex scandals coming out of Hollywood is over for the time being, or do you think things are only just getting started? How do you feel about what's happened? Do you think all of them are facing ruined careers? I'm mixed on that. I think Harvey Weinstein is definitely finished. Kevin Spacey? Probably not, I'm thinking. Do you think this will continue to affect Hollywood, or has the attention effectively shifted over to the politicians?