From the beginning of the American republic, most of the country’s thinkers and politicians have argued that our nation neither had nor needed a Left.
Historians of the so-called liberal consensus school argue that the United States has always enjoyed broad agreement on such matters as private property, individualism, popular sovereignty, and natural rights. Others claim that the country never developed the leftist working class or peasantry seen in other nations, a claim often termed American exceptionalism. Still others, including Cold War liberals and neoconservatives, say that the country does not need a Left because it already believes in, or has even achieved, goals such as democracy and equality.
But these are all misleading ways to understand America. The country has always needed, and typically has had, a powerful, independent, radical Left.