What does the Bible say about Christians going to the doctor?

On the surface, the question of whether Christians should go to doctors is somewhat ridiculous. Luke was a physician. Many hospitals were started by churches. There is nothing in the Bible that forbids a believer from seeing a physician. It is cultural circumstances—both in the ancient past and in the present—that cause some to stay away from doctors.

The past—medicine and spiritual warfare

The relationship between disease and the demonic in ancient history is convoluted. Medicine has been a field of study in Egypt since at least 3000 BC and in the Middle East since 2000 BC. While practitioners used a system of examination, diagnosis, treatment, and prognosis similar to the one doctors use today, they also took the influence of supernatural forces into consideration. In India, medical documents describe both surgeries and exorcisms.

In a world before the scientific method, before germs were discovered, it's easy to see why the ancients attributed so many medical conditions to supernatural causes. But it's confusing to note that Jesus did, too. In Mark 9:14-29, He healed a boy subject to episodes that sound like epilepsy. But instead of clarifying the situation and explaining it was really an issue with the brain, Jesus simply says, "You mute and deaf spirit, I command you, come out of him and never enter him again" (verse 25). In verse 29, He restates that the condition was caused by a spirit, although He healed many other medical conditions without attributing the illness to demonic influence.

Healing is strongly associated with religious leaders in the Bible. In the Mosaic Law, someone who believed he was cured of a skin condition was required to present himself before the priests for validation (Leviticus 14:1-9; Luke 17:14). Healing was one of the primary miracles that identified an authentic prophet of God (1 Kings 17:17-24). In the early church, James exhorted sick Christians to have the elders pray over them for healing (James 5:14). Over the next two thousand years, the church and the medical field had a volatile relationship: the church was by turns the lone source of medical aid and then fearful of scientific experimentation that could have led to more effective care.

The present—health and wealth

There are several modern churches that discourage or forbid members from seeing doctors. Early Christian Scientists taught that illness is a mental illusion that can be cured with prayer and right thinking. Jehovah's Witnesses take the restrictions on eating blood (Genesis 9:4; Leviticus 17:12-14) so seriously that they refuse blood transfusions. Some independent churches reject medicine because of the connection between Greek medical terms and pagan gods and myths. Many Christians who go to faith-healing churches refuse to see doctors, thinking that doing so shows a lack of faith in God and His sovereignty over healing.

The "health and wealth" preachers feed into this belief, although sometimes inadvertently. It has become a common teaching that God wants everyone to be healthy, but that we must choose health by internalizing His Word and having enough faith. Both prosperity gospel teaching and Christian Science base their beliefs on the "New Thought" movement, which says we can control reality with our minds. While not all prosperity preachers discourage seeking medical attention, their emphasis on healing through prayer and faith gives ammunition to Christians who are already reluctant to see doctors.

It's a deceptively easy trap to fall into. Medical help is expensive. Injuries and deaths caused by malpractice are tragic enough to become scintillating topics of discussion. It's tempting to feed a fear of doctors and to trust in warm thoughts that Jesus will heal.

But it isn't biblical. No matter how proponents twist Scripture, the Bible does not say that in the church age every Christian can expect health. Nor does it say that we should not seek medical help from people whom God has endowed with the intelligence and ability to provide it. Many times Jesus said, "Your faith has made you well," but He never told us to avoid doctors.


Related Truth:

Healing - What does the Bible have to say?

What is the key to knowing the will of God?

What do the Jehovah's Witnesses believe? Who are the Jehovah's Witnesses?

What does it mean for Christians to be in the world but not of the world?

What does the Bible say about Christians getting insurance?
