A large number of middle-class Americans are now claiming that President Biden hasn't done anything since coming into office and has actually made America worse. Do you agree?

Yes: America has gotten worse.

No: Biden made America better.
