
Do election victories really give presidents a ‘mandate’?

by admin

Presidents often claim their election victories give them a mandate. How true is that in this hyperpolarized era, when President-elect Trump didn't win 50% of the popular vote?

