You may have seen recent headlines warning that the U.S. could face an “even more contagious COVID-19 subvariant.”

“On the surface, U.S. Covid-19 metrics continue to show improvement, with cases, hospitalizations, and deaths down dramatically from their peaks just two months ago. But some experts are growing increasingly nervous that the positive trends may be slowing down or even headed for reversal.

“The country needs to prepare for another spike in cases, they’re warning… Europe has seen a surge of cases in recent weeks, and the situation there has typically foreshadowed that in the United States.”[i]

Given that we've never seen a virus behave the way this one does, SkyWatch TV is publishing this online series detailing the many ways you can naturally bolster your immunity, and your family's, against viruses and disease.

When we begin to search for ways to foster well-being through natural approaches, one of the first challenges we face is having family, peers, or others offer negative opinions about the journey. For many, leaning on natural elements seems too simple to hold the keys to solving complex medical concerns. However, we wish to challenge this thinking: Why shouldn't our bodies benefit most from a gentle, natural approach? Why should harsh pharmaceuticals and chemicals (which often have detrimental side effects) always be our first resort? And why is the success of natural remedies never touted in the news or on social media? The answer seems to be that our society has migrated toward buying quick fixes for many ailments and unquestioningly submitting to the authority of those prescribing these solutions, rather than investing in overall well-being across the span of a lifetime, even when we aren't experiencing discomfort or illness.

For example, Cynthia Thurlow, founder of CHT Wellness' signature program, Wholistic Blueprint, recognizes that many would rather remedy obesity with a pill than with a fitness-oriented approach. "It makes me want to cry," she says, "when my female patients would prefer I write them a prescription than work on changing their diet, [engage in] more exercise, [or take on] other lifestyle changes."[iii] Yet shortcuts such as pharmaceuticals are taken every day, with side effects disregarded and overall wellness shoved to a back burner; too many patients follow, without question and at the expense of their health, the professionals who facilitate these practices.

Unfortunately, many within our modern society are conditioned to subject themselves to authority. This becomes cyclical, in that the more we outsource the care of our own health, the more we must lean on those whom we perceive to be experts on our well-being. The image portrayed to a consumer or patient, however, doesn’t always carry the promised merit. For example, actors or models hired to pose in lab coats are often the ones we place our trust in as we buy products labeled with claims of bettering our well-being. As a result, we embrace foods, supplements, and even pharmaceuticals based on our impression (and thus, trust) of such characters who, in real life, may have no credibility, training, or authority to make such recommendations (more on this in a bit).

When it comes to those who have had medical training, such as healthcare workers, doctors, and surgeons, we often obey them without question simply because we perceive them to be “in the know.” Consider the number of times you may have heard an acquaintance speak of taking a medication that either manifested a dangerous side effect or counteracted another one he or she was taking. The results of these side effects and interactions can range from inconvenient to potentially fatal. Even when a doctor has thoroughly researched and vetted a medication, the product may have hidden secondary effects that haven’t yet been discovered or reported.

In any such instance, the reason for taking the drug always seems to be the same: “The doctor told me to.”

Dangerously Conditioned

Our society is so dangerously inclined to blindly follow the orders of attending physicians that there is a record of many instances in which patients, or even subordinate healthcare professionals, have gone against their own instincts or training in order to follow a superior's orders.

It is unfortunate that, as we go about our daily lives giving little or no thought to the preventative care of our bodies, we elevate medical professionals so highly: We expect them to solve complex medical issues that are often the product of our own self-neglect. We perceive them to be capable of healing us should the need arise; in this way, we avoid having to take responsibility for our own health. We imagine physicians as some type of “safety net,” with unlimited resources and vast knowledge that will surely bail us out should we find ourselves ill. Not only is this false thinking, but it puts us in the unfortunate position of placing godlike expectations upon mere human beings. In response, we surrender authority to them.

An extreme example of this submission-to-authority dynamic can be found in a highly controversial experiment conducted by Stanley Milgram at Yale University in the early 1960s (we’ll elaborate on the relevance between this experiment and our current topic in the upcoming pages). In this study, volunteers were recruited to participate in what they were told was a learning experiment. In truth, the study focused on individuals’ propensity to obey authority figures and the researchers’ desire to see just how far a participant would go to remain compliant. (The ethics of this study were subsequently called into question, for good reason, but it nonetheless illustrates a willingness to obey authority figures even if it goes against one’s inclination.)

The experiment worked like this:

Volunteers were assigned the role of "teachers" and believed the "students" were fellow participants in the study. The stated premise of the research (not the real one) was to test whether memory retention improves when reinforced by punishment: an electric shock administered whenever a "student" gave an incorrect answer. In reality, the "students" were actors who pretended to convulse in pain when "shocked" for a wrong answer. The unsuspecting "teachers" were instructed to relay word pairs to their counterparts, who would supposedly try to remember the words in order to avoid the punishment. When the time came for recall, if the "students" didn't remember the words, the "teachers" were to administer the penalty, a consequence technique said to increase learning and thus (supposedly) a vital part of the process.

This discipline was administered by requiring participants to press a series of buttons they were told delivered voltage. The severity of surges was labeled on the mechanism in these increments: “Slight Shock,” “Very Strong Shock,” “Danger: Severe Shock,” and even “XXX.”[iv] As the number of incorrect answers given by the “students” (recall that the volunteers were unaware that they were dealing with actors) increased, the volunteers were instructed to administer shocks in mounting levels of intensity. As the power of the shocks escalated, those receiving the punishment would complain that the pain was becoming more intense. Eventually, the actors would be screaming, even stating the desire for the experiment to end and saying that they no longer wanted to participate. At times, some would refuse to answer the questions, supposedly afraid they would give the wrong response and be shocked again. However, the conductors (individuals in charge) instructed volunteers to treat non-responses as wrong answers; this caused volunteers to have to administer severe shocks to people who did not respond.

This experiment may seem cruel and manipulative. Certainly, many of Milgram's critics thought so, and his methods did not survive subsequent ethical scrutiny; truthfully, his career was never the same after they were revealed. But we can still gain valuable knowledge from the research (although we are saddened by the trauma it surely caused his volunteers). People asked to predict their own behavior in such a situation usually insist they would refuse to initiate or continue administering shocks as the intensity escalated. Yet, surprisingly, two-thirds of Milgram's participants remained obedient all the way through the experiment: Two out of three continued to send electrical surges after the "student" asked for the procedure to stop, beyond the point at which he or she became completely unresponsive, and even to the point of administering the maximum voltage, said to equal 450 volts.[v]

Milgram noted visible signs of inner conflict among participants, such as "sweating, trembling, stuttering, biting their lips, and so on," but despite this, they still deferred to the authority of those they perceived to be the "experts" or "in charge."[vi] He found similar results when conducting the same study with one variation: At the beginning of the procedure, the individual receiving the voltage mentioned having a heart condition. Even then, the percentage of people who followed the order to deal out shocks was 62.5 percent.[vii] Also interesting is that when the experimenter left the room and instead gave orders over a telephone, the obedience rate dropped to 20.5 percent.[viii] This indicates that the physical presence of the authority figure giving the orders has a real impact on the subordinate's compliance.

To further identify factors that may have contributed to obedience, Milgram relocated the experiment to a shabby office and ran it again there, to see whether the willingness to follow orders had anything to do with the participants' esteem for Yale University. The obedience rate dropped to 47.5 percent.[ix] This means that, in addition to the proximity of the authority figure, the participants' opinion of the professional setting lent credibility to the commands.

Through this effort, Milgram was trying to explain why, in circumstances such as the Holocaust or the My Lai Massacre in Vietnam, when soldiers killed hundreds of civilians, those following orders would be compelled to obey beyond what their morality would ordinarily allow. As stated earlier, Milgram's methods were criticized by his peers, who said that the experiment had no "external validity" (in other words, the results could not be "generalized to other situations and other people") as it pertained to the willingness to inflict pain on another human being merely to follow orders.[x]

However, what this experiment did show is brought to light by another of Milgram's critics: "Participants in an experiment are concerned with being good subjects and acting in a manner that they perceive is expected of them."[xi] Participants were assured by trusted influences, such as the experimenter, who during some sessions appeared as a representative of the esteemed Yale University, that their "students" would suffer no permanent physical damage.


Certainly, the controversial nature of this experiment should be considered before assuming that what we learn from it can be leaned upon flawlessly. We're not asserting that the methodology was without fault. Yet, there are elements worth correlating to our psyche as patients. For one thing, participants in this process maintained obedience rates between 47.5 and 62.5 percent, despite believing that they were inflicting discomfort on their "students." (Milgram obtained other percentages of compliance and disobedience by subsequently varying the procedure significantly from the one discussed here.) The fact that individuals were willing to follow such instructions likely stemmed from three primary elements:

  1. They didn’t believe they were inflicting permanent physical damage on the “learner.”
  2. A figure of authority told them their actions were necessary for the “memory study,” and thus contributed to the general betterment of society via the knowledge gleaned.
  3. The participants’ compliance reflected their esteem for the facility or the qualifications of the individuals hosting the procedure.

At this point you may be wondering how a psychological test nearly six decades old and conducted by a man with questionable methods could possibly be relevant in today’s world. As we’ve admitted, there are flaws in the concept of leaning completely on this experiment to make our argument. Yet, this study does demonstrate some timeless truths:

  • People often obey instructions—even those that go against their own instincts—when someone in authority tells them they are necessary or good.
  • The credibility of the facility represented adds motivation to the willingness to submit to authority.
  • People frequently perceive a beneficial outcome to be worth the endurance of pain.

When we apply these truths to current attitudes toward healthcare, we can easily see that a large percentage of the populace readily farms out the responsibility for their healthcare to people who seem professional, appear knowledgeable, and work in facilities we have confidence in. Additionally, we tend to trust specialists who use the latest equipment, who have access to the largest and most up-to-date inventory of pharmaceuticals, and who seem to speak in language that demonstrates a solid working knowledge of the subject matter.



Often, those seeking medical care are overwhelmed by their symptoms and assume that professionals know more about their bodies than they do. Regardless of how society came to be this way, the vast majority of Americans completely count on their doctors to make their healthcare decisions. As surprising as it may seem, the mentality Milgram’s study unearthed still thrives in the undercurrent of the psychology of medical care.

It is an unfortunate fact that we live in a “doctor knows best” setting, where a large percentage of those in our communities are content to farm out healthcare decisions to those they perceive to have both unlimited knowledge and the power to save them. Consider the trend of leaving specialized pursuits up to the “experts.” This is how we purchase our food, obtain home and car repairs, file our taxes, and secure legal direction; it’s how we do everything. It would seem that the details of taking proactive care of our health have become an inconvenience we can hardly afford the time for. In other words, we delegate the maintenance or repair of everything, including our own bodies, to those we perceive as being in the know.

Yet, every human being is capable of error. Dr. Vinita Parkash, a professor at Yale School of Medicine, explains that as discovery within the medical industry grows more complex, the capacity for error increases.[xii] She confesses in a transparent and reflective article:

My biggest mistake (that I know of) happened early in my practice… In this case, I missed the cancer cells on my 42-year-old patient, causing her cancer diagnosis to be delayed. She eventually died of cancer.[xiii]

Dr. Parkash relays that doctors face pressure to always have the correct answer to every question: to be considered quality physicians by their patients, to avoid lawsuits and disciplinary action, and even to maintain esteem among colleagues.[xiv] Certainly, even the best in their fields are incapable of assembling such a track record. By Dr. Parkash’s own confession, the best physician is one who “makes mistakes, acknowledges them, and learns.”[xv]

However, this isn’t how we perceive medical professionals at all, nor would we grant them the luxury of being so transparent, so human. To be certain, a physician who openly acknowledged such shortcomings would be targeted for lawsuits or so severely mistrusted that he or she would have no client base. Thus, doctors must perpetuate the idea that they have earned a place on the “all-knowing” pedestal we seem so ready to position them on. It’s easy, then, to see Milgram’s observations alive and well today. The image of a trustworthy physician becomes all too similar to that of a comic-book hero: a superhuman, all-knowing, never-wrong champion who can save us from ourselves. And when we find someone to fill that role, we readily relinquish authority over our health to that professional.

We see this every day among those pursuing healing. When a medical issue surfaces, we look for an “expert” to fix it. They’re easy to find: Billboards advertise attractive professionals whose very countenances ooze capability. The “shelves” of online stores overflow with “healthy” foods and supplements portrayed as miracle cures for nearly anything that ails us, and fad diets lure overweight people into the notion that a monthly membership will buy the slimmer physique they desire. Added to the mix are advertisements for pharmaceuticals, medical facilities, and surgical procedures that portray thriving, happy, eye-catching people who smile reassuringly, telling consumers they “got their life back” after buying in to the symptom-covering fix-all that we call our modern medical industry.

Such media and propaganda are damaging. They distance us from our responsibility to take an active role in managing our health. Additionally, they feed the common train of thought that if something happens to go wrong with the body, it’s easily fixed. After all, we have experts to take care of that! In the meantime, symptoms are masked, and the physician who states there is no reason for further concern becomes the authority who frees patients to resume poor health habits. And why wouldn’t these “experts” have such clout in the eyes of the layperson? They are professionals: precisely the type of person, Milgram demonstrated, whom people are inclined to obey, even when doing so goes against their own instincts or moral code.

I (Joe Horn) will give you a personal example. Just before I went in for my colon surgery, my wife asked the doctor if there could be any advantage to my taking probiotics. She had begun to wonder (as I have told previously in Timebomb) if there was a link between my medical problems and diet. The physician answered that “no science anywhere…illustrates that probiotics do anything.” He went so far as to explain that many people have their colons completely removed and live perfectly normal lives.

Then, he pulled out a brochure. I’ll never forget it.

The colorful piece of media was produced to assure the patient anticipating a full colectomy (complete removal of the bowel) of the hope for an ordinary, pain-free, disease-free future. However, what really stands out in my memory are the models of post-surgical patients portrayed in the publication. One was an extremely attractive woman on the beach wearing a bikini and a colostomy bag. Another was dressed like some sort of kickboxer or cage fighter. Not only did these models represent patients as happily postoperative, but they were extremely physically fit, even athletic. As I stated in Timebomb, I was so tired of being unhealthy for so long that I succumbed to the propaganda.

My wife, however, wasn’t so quick to embrace the message. Her instinct was talking to her, and she was listening. However, because of the acuteness of my symptoms, the exhaustion brought on by years of chasing healing and feeling as though we had run out of options, and the influence of this “knowledgeable professional,” we went through with the procedure.



Later, I found that literally thousands of studies link healthy gut microbiota to improved health, both overall and in specific areas. The doctor’s statement simply wasn’t true. Now, please know that I hold no ill will toward this physician. He is reputed to be one of the best in his state, highly acknowledged and regarded by his peers, and decorated with all the symbols of success that come with good practice. As I stated in Timebomb, if a person were to opt for the surgical removal of part of his or her body, this was definitely the right guy to see.

However, this doctor’s dismissal of my wife’s intuition resulted in the removal of part of my colon. I forfeited—forever—the opportunity to find a natural way to heal from that illness. This decision was influenced by his authority and the trust we placed in him, the facility, and even the innovations that lent esteem to his advice.

Whether or not you accept the premise that Milgram’s study is worth considering, it remains a fact that most people hand over their healthcare to providers. Whether it’s the allure of a quick fix promised by a model on a billboard, the convenience of allowing someone else to manage your health, or even a perceived lack of individual empowerment, the sad truth stands: for some reason, we live in a “doctor knows best” society.

It’s important to note that the onus for this problem is not on doctors. For example, in my situation, even though the physician said there was no science to back up the concept of healing through balancing gut microbiota, he gave us an answer that followed industry standards. He was merely following protocol. This is the heart of the point I’m trying to make: The responsibility is not on doctors to take over our health; it is on us. Few people see a physician for wellness exams, and multitudes avoid giving thought to improving nutrition through diet or supplementation, or to taking other preventative measures. Instead, most wait until there is a problem, then seek help from a medical professional who is expected to provide damage control for health that has already been allowed to spiral out of control. Then, we look to them as authority figures and miracle workers. Ask yourself this question: How can we expect these medical professionals to care for us by reversing the series of decisions we’ve made that may have escalated into illness?

This way of thinking leads to further problems that subsequently hinder our well-being. For example, when we don’t provide proper nourishment for our bodies or take measures to fortify our immune systems, it’s inevitable that we will become ill. For many, the immediate response when we’re sick is to go to the doctor for antibiotics. However, these damage the gut flora, the balance of healthy bacteria that reside in the intestine. Over time, this harms the immune system while giving harmful bacteria and even viruses the opportunity to become stronger, causing future bouts of illness to become more severe. Additionally, antibiotics have been linked with digestive problems, fungal infections, anaphylaxis, and even kidney failure. Some studies have linked the use of these drugs to cardiovascular problems,[xvi] delirium or cognitive interference,[xvii] and even Crohn’s disease.[xviii] Yet, it seems that every time a “bug” works its way through our communities, many people (who take no preventative or immune-fortifying steps the rest of the time) take antibiotics in search of relief. We do this to our bodies, repeatedly and without question, because it’s convenient, and because a doctor tells us it’s the road to wellness.

On a more severe note, many folks (such as a previous version of myself, Joe) opt for surgery that makes permanent physical alterations at the advice of these professionals. When I had the segment of my colon cut out, I was told that it was literally nothing more than storage for waste matter soon to be defecated; that section of my colon served no other purpose. After I began to recover from the surgery and seek wellness via natural methods, my continual lack of energy became a concern. As I dug for answers, I finally found that the portion of colon I sacrificed is actually the area responsible for the absorption of B-complex vitamins, which are vital for metabolizing incoming nutrition into energy. Without the ability to take in B vitamins, my body struggled with a severe energy lag while my taxed adrenals attempted to fill the gap by overworking, a health conundrum that required significant recovery time. And this revelation was not unearthed until I engaged a natural healthcare practitioner.




[ii] Ibid.

[iii] TEDx Talks. “Intermittent Fasting: Transformational Technique.” May 15, 2019. YouTube Video, 12:44. Retrieved March 12, 2020.

[iv] Griggs, Richard. Psychology: A Concise Introduction, 5th Ed. (New York, NY: Worth Publishers, 2017), Pg. 383.

[v] Ibid.

[vi] Ibid., Pg. 384.

[vii] Ibid.

[viii] Ibid., Pg. 386.

[ix] Ibid., Pg. 387.

[x] Ibid., Pg. 388.

[xi] Ibid.

[xii] Parkash, Vinita. “The Cost of Assuming Your Doctor Knows Best.” November 13, 2017. Cognoscenti Online. Retrieved April 23, 2020.

[xiii] Ibid.

[xiv] Ibid.

[xv] Ibid.

[xvi] “Commonly Used Antibiotics May Lead to Heart Problems.” University of British Columbia. September 10, 2019. Retrieved April 23, 2020.

[xvii] “Common Antibiotics May Be Linked to Temporary Mental Confusion.” American Academy of Neurology. February 17, 2016. Retrieved April 23, 2020.

[xviii] “Antibiotic Use Linked to Crohn’s Disease.” Health 24. April 26, 2017. Retrieved April 23, 2020.
