With the direction America is headed in, a number of people now believe that America is losing its morals. Do you think American morals are declining?

Yes: Democrats are ruining American morals.
No: America is getting better under Dems.