health care

  

Definition

The taking of preventive or otherwise necessary medical measures to improve a person's well-being. This may be done through surgery, the administering of medicine, or changes to a person's lifestyle. These services are typically delivered through a health care system made up of hospitals and physicians.

http://www.businessdictionary.com/definition/health-care.html
