A doctor is a medical professional trained and licensed to practice medicine, with the goal of maintaining and improving human health. Doctors diagnose and treat illnesses, injuries, and other physical and mental health conditions. They play a crucial role in society, not only by providing direct patient care but also by contributing to public health initiatives and to advances in medical knowledge.