So much of Hollywood is this kind of overly macho, nonsensical view of masculinity, which I just don't find honest. I think it's this idea of - you know, we're told, 'Be a man, be a man.' But what does that mean, exactly? Does it mean you can't carry yourself with any fear? That you can't acknowledge that you're scared?