Most Americans believe that a better health insurance system would improve public health, but they are wrong. They have had such a system all along, and what has it delivered? In 2020 especially, they enjoyed its full benefits, and what happened?