Yes, most Americans get health insurance through work. But not all employers offer coverage, and those that do don't always extend it to every worker. Even when coverage is offered, cost and quality vary widely, making Sen. John Thune's statement an oversimplification.
“A lot of times, health care comes with a job.” Source: PolitiFact.com