
Do Far-Left Democrats Want To Destroy America?


Yes: They want to destroy America.

No: They want to make America great.

Given some of the outrageous ideas that far-left Democrats have pushed on the American people, a number of Americans believe they secretly want to destroy America. What do you think?

