Do you agree with this?
It is not a stretch to say that the ascent of the South has been the single most important development in US politics in the past 50 years. Beginning in the mid-1960s, with fitting irony almost 100 years after the collapse of the Old Confederacy in the Civil War, the South has steadily risen to a remarkable dominance of American politics.
And are those who argue this right that Dixie's days are over?