What role, if any, should the U.S. Government play in health care coverage for Americans?

Should the government's role in health insurance differ for those who have jobs and those who do not?