Photo by Samuel Branch on Unsplash
A large number of middle-class Americans now claim that President Biden hasn't done anything since coming into office and has actually made America worse. Do you agree?

Yes: America has gotten worse.
No: Biden made America better.