Health insurance in the United States
In the United States, health insurance is any program that helps pay for medical expenses, whether through privately purchased insurance, social insurance, or a social welfare program funded by the government. Synonyms for this usage include "health coverage," "health care coverage," and "health benefits."

This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and the Creative Commons Attribution-ShareAlike License.