Starting in the 1960s, a growing number of Hollywood films depicted a violent, alienated American society quickly losing its mind. It's hard not to see their relevance to our times.