With some of the outrageous ideas that far-left Democrats have pushed on the American people, a number of Americans believe that far-left Democrats secretly want to destroy America. What do you think?

Yes: they want to destroy America.

No: they want to make America great.