Given the direction America is headed, many people now believe the country is losing its morals. Do you think American morals are declining?

Yes

Democrats are ruining American morals.

No

America is getting better under Dems.
