With the direction America is headed, a number of people now believe that the country is losing its morals. Do you think American morals are declining?

Yes

Democrats are ruining American morals.

No

America is getting better under the Democrats.