
After Jimmy Carter Won the Presidency, Democrats Lost the South






Mr. Carter witnessed a shift from what had been a solidly Democratic South to one that Republicans, supported by white voters and particularly evangelicals, came to dominate.



