The Greatest Health Revolution (Charting the Real Reasons Infectious Diseases Faded into History)
“The history of vaccination from its beginning to its present position is a refreshing illustration of the truth that medical science is human first and scientific afterwards... The refusal of the professional leaders to go back upon their mistake, when it was abundantly proved to be a mistake, has become an inherited obligation of hard swearing to successive generations. Things have now come to such a pass that anyone who undertakes to answer for Jenner and his theories, must shatter his own reputation for scientific and historical knowledge. Most of those who have a reputation to lose decline the challenge.” (1890)
— Dr. Charles Creighton, MD, Professor University of Cambridge, author of numerous writings, including History of Epidemics in Britain (vol 1 & 2), Bovine Epidemics in Man, Cowpox and Vaccinal Syphilis, Jenner and Vaccination: A Strange Chapter of Medical History, and Vaccination in the Encyclopædia Britannica in 1888
“More than twenty years ago I began a careful study of the subject of vaccination, and before I got through, I was forced to the conclusion that vaccination was the most colossal medical fallacy that ever cursed the human race. Few physicians attempt to investigate this subject for themselves. They have been taught to believe its efficacy. They have vaccinated because it was the custom and they were paid for it. They have supposed vaccination would prevent small-pox because the best authorities said it would, and they accept it without question.” (1888)
— Dr. A. M. Ross
“The fact that an opinion has been widely held is no evidence whatever that it is not utterly absurd; indeed, in view of the silliness of the majority of mankind, a widespread belief is more likely to be foolish than sensible.” (1929)
— Bertrand Russell, Philosopher
“By the time laboratory medicine came effectively into the picture the job had been carried far toward completion by the humanitarians and social reformers of the nineteenth century... When the tide is receding from the beach it is easy to have the illusion that one can empty the ocean by removing the water with a pail.” (1959)
— René Dubos
The Unquestioned Dogma of Vaccination
Despite recent, highly polarizing events at the U.S. Department of Health and Human Services (HHS)—such as the appointment of Robert F. Kennedy Jr. as Secretary, which was celebrated by some and met with consternation by others—vaccination remains the unchallenged, go-to solution, often deployed without even the hint of a critical question. On August 5, 2025, Secretary Kennedy announced the termination of 22 mRNA vaccine development projects under the Biomedical Advanced Research and Development Authority (BARDA), totaling $500 million in funding, citing data showing that these vaccines were ineffective in protecting against upper respiratory infections such as COVID-19 and influenza. The reallocated funds were shifted to support an alternative vaccine project. Months earlier, on May 1, 2025, HHS and the National Institutes of Health (NIH) had announced the launch of the “Generation Gold Standard” program, a $500 million initiative to develop a universal vaccine platform. Using a beta-propiolactone (BPL)-inactivated, whole-virus approach, the program seeks to provide broad protection against multiple strains of pandemic-prone viruses, including influenza and coronaviruses. Ultimately, the prevailing orthodoxy dictates that it’s not a matter of whether vaccination itself is safe and effective or the correct course of action, only which particular technology to pursue.
The near-religious faith in infectious disease management through vaccines and antibiotics has been deeply embedded in our societal consciousness. Whenever a discussion involves a supposed contagious agent, the only real solution proposed is vaccination, as we saw during the COVID-19 era. In addition, the notion that vaccines are inherently “safe and effective” is a near-Pavlovian response whenever the word is invoked. This mantra hearkens back to when Edward Jenner declared, over two centuries ago and without any rigorous evidence, that his invention of vaccination was to be given with “perfect ease and safety” and was capable of “rendering through life the person so inoculated perfectly secure from the infection of small-pox.”
For decades, there have been endless disputes about the health of vaccinated versus unvaccinated individuals, potential links to neurological damage or autism, the toxicity and safety of vaccine ingredients, long-term immune system effects, and the ethical and regulatory implications of mandatory vaccination, as well as catastrophes such as the Swine Flu Fiasco of 1976 and the Cutter disaster. The fact that Edward Jenner’s statements were demonstrably false virtually from the outset, coupled with these long-standing arguments, makes it all the more remarkable that the word “vaccination” has continued to hold a near-divine status—regardless of what ingredients it contains, who manufactures it, the absence of long-term safety testing, the presence of conflicts of interest, or the mounting evidence of harm.
These disagreements will not end anytime soon, much like other enduring debates in the public sphere—from the best diet, the amount of alcohol (if any) that is safe to drink, and the benefits or harms of coffee to the impact of red meat, the value of supplements, and broader lifestyle questions such as exercise routines, sleep requirements, and screen time.
On May 7, 2025, the NIH–CMS Data Partnership announced its plan to create a real-world data platform that integrates claims, electronic medical records, and consumer wearable data to investigate the root causes of autism spectrum disorder (ASD) and inform long-term strategies.
To believe this system will ever definitively resolve the issue is a form of magical thinking—or perhaps more cynically, a way to placate the public and continue business as usual. Powerful institutional and ideological forces are likely to do everything in their power to delay, obfuscate, or manipulate the findings; even carefully conducted studies are destined to be dismissed, misrepresented, or weaponized to support pre-existing narratives rather than pursue the truth. Given the deeply polarized nature of the debate, any result released after years of work and the investment of countless dollars—if a clear and worthwhile result is ever produced—will be immediately contested, with each side convinced the conclusion was incorrect, flawed, or an out-and-out lie.
But there is a far simpler, cheaper, definitive, and faster way to assess the impact of infectious diseases and vaccination—a method that cuts through the noise: examining existing mortality data. Preventing deaths, after all, is the main claim to fame of vaccines, and you might imagine this critical data is something policymakers have relied upon for decades. You might also assume that this data is featured prominently on the websites of institutions such as the CDC (Centers for Disease Control), the NHS (National Health Service), and the AAP (American Academy of Pediatrics). However, you would likely be sorely disappointed. While the raw data exists, it is rarely presented in a clear, historical context. Few charts that vividly illustrate trends—such as pre-vaccine versus post-vaccine era mortality—are easily accessible. Instead, this crucial information is often buried in detailed reports and surveillance databases, fragmented across countless pages, or locked away within academic publications.
United States Mortality Data (1900–1965): A Story Told in Trends
While this essential data is not prominently displayed, it can be accessed through some research. For example, by utilizing historical United States Vital Statistics data, we can construct a mortality chart tracking deaths from diseases like scarlet fever, whooping cough, measles, diphtheria, and typhoid over a significant period: 1900 to 1965.
The story this chart tells is nothing short of revolutionary. What was immediately shocking—and what completely dumbfounded me nearly 30 years ago—was the revelation that the measles death rate had already fallen by a staggering 98% before the introduction of the first vaccine in 1963. Like most people, I had assumed that measles mortality would be high right up until the vaccine's arrival, after which it would drop precipitously. But that is not what the data revealed; it was the complete opposite of what I, and nearly everyone else who saw this chart for the first time, expected.
Another critical observation is that the mortality rate from whooping cough had already declined by an astonishing 95% by the time its vaccine was introduced in 1948. Once again, this reality contradicted my initial assumption that deaths would have remained high until the vaccine's arrival. Instead, as with measles, the vast majority of the decline occurred long before widespread vaccination.
Furthermore, the data reveal an even more compelling point: two other diseases noted on the chart, scarlet fever and typhoid, declined to near-zero levels in the absence of any vaccine at all.
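To make the arithmetic behind these figures concrete, here is a minimal Python sketch of the percent-decline calculation, assuming the annual death rates have already been transcribed from the published Vital Statistics tables. The rates used below are illustrative placeholders, not the actual published figures; the same calculation applies to whooping cough, to the England and Wales series discussed later, or to any other disease and intervention year.

```python
# Minimal sketch: how much of the fall in a death rate occurred before a given
# intervention year. The rates below are illustrative placeholders (deaths per
# 100,000 population per year), not actual Vital Statistics figures.

measles_death_rate = {
    1900: 13.3,   # hypothetical early-century baseline
    1963: 0.2,    # hypothetical rate in the year the first measles vaccine was licensed
}

def percent_decline(series, start_year, intervention_year):
    """Share of the starting death rate already lost by the intervention year."""
    start = series[start_year]
    at_intervention = series[intervention_year]
    return 100 * (start - at_intervention) / start

print(f"Decline before the 1963 vaccine: "
      f"{percent_decline(measles_death_rate, 1900, 1963):.0f}%")
```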

The advent of antibiotics, with penicillin entering mass production in 1944 and streptomycin following in 1947, came well after the overwhelming majority of the decline in mortality had already taken place. This timeline powerfully reinforces the conclusion that factors other than modern medicine were the primary drivers of this trend. This view is strongly supported by a seminal study published in Pediatrics in December 2000, which concluded:
“...nearly 90% of the decline in infectious disease mortality among US children occurred [from 1900] before 1940, when few antibiotics or vaccines were available.”
England and Wales Mortality Data (1838-1978): An Even More Dramatic Decline
The evidence from the United States is compelling, but the story becomes even more definitive when we look across the Atlantic. England and Wales began gathering national mortality data in 1838, 62 years before the start of the U.S. series.
By synthesizing data from the Office for National Statistics and other historical sources, we can construct a comprehensive mortality chart that tracks death rates from major infectious diseases—scarlet fever, whooping cough, measles, diphtheria, and smallpox—over a sweeping 140-year period: from 1838 to 1978.
The trend revealed by this data is even more extraordinary than that of the United States. Mortality from these five diseases was massive throughout the mid-1800s, beginning a sustained and dramatic decline around 1875. The scale of this pre-vaccine improvement is staggering: the death rate from whooping cough had fallen by over 99% before the vaccine's national introduction in 1957. Even more mind-boggling, the mortality rate from measles had declined by over 99.9%—effectively nearly 100%—prior to the national rollout of its vaccine in 1968.
Perhaps most strikingly, scarlet fever—a far deadlier killer than either measles or whooping cough in the 19th century—declined to near-zero levels without any vaccine ever being developed. Once again, the advent of antibiotics occurred only after the vast majority (roughly 99%) of the mortality reduction had already been achieved.
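For readers who want to rebuild this kind of chart themselves, the sketch below shows one way it might be done once the annual death rates have been assembled into a spreadsheet. The file name, column names, and marker years used here are assumptions for illustration only; substitute whatever the transcribed historical tables actually contain.

```python
# Minimal sketch, assuming the annual death rates (per 100,000) have been
# transcribed into a CSV with a "year" column and one column per disease.
import pandas as pd
import matplotlib.pyplot as plt

rates = pd.read_csv("england_wales_death_rates.csv", index_col="year")  # hypothetical file

fig, ax = plt.subplots(figsize=(10, 5))
for disease in ["measles", "whooping_cough", "scarlet_fever", "diphtheria", "smallpox"]:
    ax.plot(rates.index, rates[disease], label=disease.replace("_", " "))

# Mark national vaccine introductions so any pre-existing decline is visible.
for year, label in [(1957, "pertussis vaccine"), (1968, "measles vaccine")]:
    ax.axvline(year, linestyle="--", linewidth=1)
    ax.annotate(label, xy=(year, ax.get_ylim()[1]), rotation=90,
                va="top", ha="right", fontsize=8)

ax.set_xlabel("Year")
ax.set_ylabel("Deaths per 100,000 population")
ax.set_title("England and Wales: infectious disease mortality, 1838-1978")
ax.legend()
plt.tight_layout()
plt.show()
```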
The Overlooked Revolution: Sanitation, Nutrition, and the Fall of Mortality
The trajectory is key: the direction and magnitude of the trend are unmistakable. The death rate was falling precipitously for all infectious diseases due to profound public health and societal advancements, including better nutrition, sanitation, electrification (which improved food safety through refrigeration and reduced indoor air pollution from oil lamps), and improved transportation, which began to replace urban horse populations and their associated waste. Furthermore, the abandonment of harmful medical practices like bloodletting and toxic medications containing mercury or arsenic reduced iatrogenic harm, while social reforms such as labor and child labor laws, the closing of dreadful basement slums in favor of better housing, and a general rise in the standard of living all contributed to stronger public health. This collection of factors established a powerful, pre-existing downward trend in mortality.
As astutely observed by W. J. McCormick, M.D., in the 1951 issue of the Archives of Pediatrics, the most significant factor in the historical decline of infectious diseases was not medical intervention but what he termed an “unrecognized prophylactic factor.” McCormick’s analysis compellingly argues that while medical advances are celebrated, they often arrive after the most dramatic reductions in incidence and mortality have already occurred. He posits that a broader, underlying force was primarily responsible for the improvement in public health.

This perspective challenges the conventional narrative that attributes the decline solely to vaccines, antibiotics, and specific public health measures. Instead, it forces us to consider the profound role of socioeconomic and environmental improvements—such as better nutrition, less crowded housing, improved sanitation, cleaner water, and higher standards of living—which collectively strengthened human immunity.
“The usual explanation offered for this changed trend in infectious diseases has been the forward march of medicine in prophylaxis and therapy but, from a study of the literature, it is evident that these changes in incidence and mortality have been neither synchronous with nor proportionate to such measures. The decline in tuberculosis, for instance, began long before any special control measures, such as mass x-ray and sanitarium treatment, were instituted, even long before the infectious nature of the disease was discovered. The decline in pneumonia also began long before the use of the antibiotic drugs. Likewise, the decline in diphtheria, whooping cough and typhoid fever began fully years prior to the inception of artificial immunization and followed an almost even grade before and after the adoption of these control measures. In the case of scarlet fever, mumps, measles and rheumatic fever there has been no specific innovation in control measures, yet these also have followed the same general pattern in incidence decline. Furthermore, puerperal and infant mortality (under one year) has also shown a steady decline in keeping with that of the infectious diseases, thus obviously indicating the influence of some over-all unrecognized prophylactic factor.”
The 1977 study by epidemiologists John B. McKinlay and Sonja M. McKinlay presented a groundbreaking and contrarian analysis of 20th-century U.S. mortality data. Their rigorous investigation led them to a conclusion that contradicted the mainstream view: modern medical interventions, including vaccines and antibiotics, played a surprisingly minor role in the historic reduction of death rates since 1900.
Their work demonstrated that for the majority of infectious diseases, the most significant decline in fatalities had transpired prior to the development of specific medical treatments. The research suggests that medical tools were implemented after broader societal advancements had already accomplished the heavy lifting.
“In general, medical measures (both chemotherapeutic and prophylactic) appear to have contributed little to the overall decline in mortality in the United States since about 1900—having in many instances been introduced several decades after a marked decline had already set in and having no detectable influence in most instances. More specifically, with reference to those five conditions (influenza, pneumonia, diphtheria, whooping cough, and poliomyelitis) for which the decline in mortality appears substantial after the point of intervention—and on the unlikely assumption that all of this decline is attributable to the intervention... it is estimated that at most 3.5 percent of the total decline in mortality since 1900 could be ascribed to medical measures introduced for the diseases considered here.”
This minimal figure of 3.5% stands as a powerful challenge to the conventional account of medical history. The McKinlays’ analysis suggests a need to reorient our understanding of health progress, emphasizing the paramount importance of foundational public health and socioeconomic conditions over purely clinical, after-the-fact interventions. Their research remains a critical, evidence-based counterpoint in discussions of health policy and investment.
Severe Disease Becomes Mild: A Forgotten Historical Fact
The historical data reveal a dual phenomenon: as mortality rates from infectious diseases plummeted, so too did the perceived severity of these diseases. Decades before widespread vaccination, the terrifying specters of whooping cough and measles were transforming into milder childhood ailments, a trend observed and documented by physicians in the field. This gradual attenuation of disease severity further challenges the narrative that medical intervention was the primary driver of improved health outcomes.
By the 1950s, the experience of measles had shifted so dramatically that a British GP could confidently state its harmlessness was a matter of common understanding. The disease was seen not as a threat, but as an inevitable and manageable part of childhood:
“In the majority of children the whole episode has been well and truly over in a week… In this practice measles is considered as a relatively mild and inevitable childhood ailment that is best encountered any time from 3 to 7 years of age. Over the past 10 years there have been few serious complications at any age, and all children have made complete recoveries. As a result of this reasoning no special attempts have been made at prevention even in young infants in whom the disease has not been found to be especially serious.”
Similarly, whooping cough (pertussis) was increasingly characterized by its mild presentation. Research in general practice populations confirmed that the classic, severe portrayal was no longer the norm, a fact that changed how doctors approached diagnosis and how parents were counseled:
“Most cases of whooping cough are relatively mild. Such cases are difficult to diagnose without a high index of suspicion because doctors are unlikely to hear the characteristic cough, which may be the only symptom. Parents can be reassured that a serious outcome is unlikely. Adults also get whooping cough, especially from their children, and get the same symptoms as children.”
This well-documented change in the natural history of these diseases led some in the medical community to openly question the necessity of universal vaccination policies. The cost-benefit calculus, for some, no longer seemed to favor mass intervention:
“…it may be questioned whether universal vaccination against pertussis is always justified, especially in view of the increasingly mild nature of the disease and of the very small mortality. I am doubtful of its merits at least in Sweden, and I imagine that the same question may arise in some other countries. We should also remember that the modern infant must receive a large number of injections and that a reduction in their number would be a manifest advantage.”
The epidemiological trends bore out these clinical observations. The decline in whooping cough incidence and mortality was a firmly established trajectory years before mass vaccination campaigns began, suggesting the vaccine was introduced into a landscape where the disease was already receding:
“There was a continuous decline, equal in each sex, from 1937 onward. Vaccination [for whooping cough], beginning on a small scale in some places around 1948 and on a national scale in 1957, did not affect the rate of decline if it be assumed that one attack usually confers immunity, as in most major communicable diseases of childhood... With this pattern well established before 1957, there is no evidence that vaccination played a major role in the decline in incidence and mortality in the trend of events.”
Together, these accounts paint a compelling picture: the conquering of childhood diseases was a complex process involving a profound reduction in both lethality and severity, driven by deep-seated societal changes that began long before the advent of vaccines.
Real-World Cases: When Falling Vaccination Rates Did Not Cause Crisis
Conventional public health wisdom suggests that a decline in vaccination rates will inevitably lead to a resurgence of disease and death. However, historical case studies from Sweden and England present a starkly different picture, challenging the very foundation of this belief. In these real-world natural experiments, the cessation or dramatic reduction of pertussis vaccination had no discernible impact on mortality, forcing a critical re-examination of the vaccine's role in the context of long-term epidemiological trends.
Sweden's Rationale for Halting a Failed Vaccine
In the late 1970s, Sweden conducted a thorough investigation into a pertussis outbreak that yielded an astonishing result. The data revealed that the vaccine was not merely imperfect; it was functionally ineffective.
“In 1978, 5,140 bacteriologically verified cases of pertussis were reported to the National Bacteriological Laboratory, Stockholm. Investigation of a subsample showed that out of 620 children aged 1-6 years with the disease, 521 (84%) had received three injections of pertussis vaccine. Another investigation disclosed that 84% of 38,015 preschool children born during 1974-8 in various regions of Sweden had been given three injections of pertussis vaccine... Since the Swedish-made pertussis vaccine evidently lacked protective effect, vaccination was stopped in 1979.”
Faced with overwhelming evidence of the vaccine's failure to protect and growing concerns over its safety profile, the Swedish health ministry made a rational, data-driven decision: they recommended the discontinuation of the whooping cough vaccination program in 1979. For the next 17 years, Sweden had no pertussis vaccine in its national program. Contrary to apocalyptic predictions, this prolonged period without vaccination did not result in a significant increase in whooping cough mortality, a powerful demonstration that other influential factors were sustaining the low death rates.
England's Natural Experiment in Vaccination Decline
A similar, albeit unintentional, experiment unfolded in England during the 1970s. Widespread public fears over the safety of the whole-cell DTP vaccine caused vaccination rates to plummet from a peak of over 75% to a low of 31% by 1978. According to the prevailing narrative, this collapse in coverage should have triggered a devastating epidemic and a sharp rise in deaths. The reality, however, was far more nuanced.
While case notifications increased, the most crucial metric—mortality—remained unmoved. As authors of a 1984 study confirmed:
“…fears about whooping cough vaccine caused a dramatic fall in immunisation rates and in consequence a large increase of notifications. Despite this increase, the number of deaths has not risen...”
A deeper analysis of the data reveals a paradox that defies simple explanation: in 1971, when vaccination rates were at their peak (78%), pertussis deaths were at their highest for the decade. Conversely, in 1978, when vaccination rates had cratered to just 31%, the death rate was less than half of what it was at the height of coverage. This inverse relationship severely undermines the claim that the vaccine was the primary factor controlling mortality.
Furthermore, placing this episode in its broader historical context is essential. The downward trend in whooping cough deaths in England began decades before the vaccine's introduction in 1957 and continued its steady, pre-existing decline irrespective of vaccination coverage levels.
Interrupting Vaccination: A Challenge to the Narrative
The experiences of Sweden and England provide powerful, evidence-based counterarguments to the standard belief in vaccine efficacy. In both cases, the removal or drastic reduction of the pertussis vaccine did not lead to the catastrophic outcomes forecast by models. Instead, mortality rates remained low and stable, continuing a long-established downward trajectory that began long before mass vaccination.
These case studies force a critical question: if the vaccine was not responsible for the suppression of mortality, what was? The evidence points to the same “unrecognized prophylactic factors” that drove the initial decline—improved sanitation, nutrition, and living standards—which continued to exert a dominant influence on population health, rendering the impact of vaccination on mortality statistically invisible in these specific historical instances.
The Influenza Enigma: A Case Study in Vaccine Futility?
The historical decline in influenza and pneumonia mortality is a public health success story, but one with a puzzling second act. Between 1900 and the 1970s, deaths from these causes had declined by a remarkable 90%, a testament to the impact of improved living conditions and medical care, albeit with room for further improvement.
It was at this point that mass influenza vaccination began. The subsequent data reveal a deeply counterintuitive trend: following the vaccine’s introduction, mortality rates actually increased from the late 1970s through the early 2000s. Vaccination rates for the most vulnerable population—adults aged 65 and over—climbed to 60-70%, and only over the next two decades did the mortality rate slowly decline back to approximately the level seen at the start of the vaccination era. This trajectory suggests that, despite widespread public acceptance and annual deployment, the influenza vaccination program has failed to make a demonstrable dent in preventing deaths, performing no better than the pre-vaccine baseline.
This failure is not an outsider's critique but an acknowledgment from the highest levels of the scientific establishment. In a seminal 2023 paper, Anthony Fauci and his colleagues at the National Institute of Allergy and Infectious Diseases conducted a sobering review of six decades of efforts.
“As of 2022, after more than 60 years of experience with influenza vaccines, very little improvement in vaccine prevention of infection has been noted. As pointed out decades ago, and still true today, the rates of effectiveness of our best approved influenza vaccines would be inadequate for licensure for most other vaccine-preventable diseases...Taking all of these factors into account, it is not surprising that none of the predominantly mucosal respiratory viruses have ever been effectively controlled by vaccines... Durably protective vaccines against non-systemic mucosal respiratory viruses with high mortality rates have thus far eluded vaccine development efforts.”
This stunning admission lays bare a fundamental truth: the challenge of controlling mucosal respiratory viruses, such as influenza, through vaccination has proven to be a monumental scientific hurdle, one that decades of research have failed to overcome.
Given this abysmal record of performance based on population-level mortality data, a critical question demands an answer: Why are these programs continued with such vigor? If the stated goal is to reduce mortality, and the primary intervention has, by its own architects' admission, failed to achieve that goal after 60 years, the justification for its continued promotion relies less on proven efficacy and more on unquestioned dogma. The case of the influenza vaccine stands as a powerful example of a medical intervention that persists despite its inability to alter the fundamental statistical reality of disease mortality.
The Smallpox Narrative: A Critical Re-examination
The conventional history of smallpox and vaccination is a tale of medical triumph. However, a closer examination reveals a far more complex and unsettling story, one that is often omitted from public discourse. The reality of smallpox eradication involves not just a vaccine, but a century of questionable practices, unexpected epidemiological trends, and an acknowledgment that factors beyond vaccination played the decisive role.
The Grim Reality of Early Vaccination
The implementation of Jenner's vaccine in the 19th century was a far cry from modern sterile procedures. The “vaccine” was a substance of unknown origin—often pus scraped from sores on cows, horses, or goats, or even matter taken from human smallpox corpses. For nearly a century, the predominant method was “arm-to-arm” vaccination, in which lymph from the pustule of a vaccinated child was used to inoculate the next; this practice was not outlawed in England until 1898. The “vaccine” material was delivered by making numerous small cuts in a person's arm with a sharp instrument called a lancet and smearing the matter into those cuts.
This chaotic and unsanitary process was a gamble with human health, a fact noted by contemporary critics. As George William Winterburn, PhD, MD, starkly observed in 1886:
“It will thus be seen what slight foundation the whole question of vaccinal virus rests. Millions of vaccinations are made every year, and nobody knows what they are made with. The whole process is a haphazard game with chance. Vaccination was accepted on the simple dictum of Jenner that it would stamp out smallpox. The medical profession of today buys its vaccinal virus of [sic] those who make merchandise of it on their simple dictum that it is the right thing to use.”
Even with improved methods in the 20th century, bacterial contamination was an inevitable and accepted part of the vaccine. As one source noted:
“With the best of care, heavy bacterial contamination of vaccine lymph is inevitable during its preparation, and as many as 500 million organisms per ml. may be present…”
The procedure itself was often a traumatic and reckless assembly line. Dr. G. H. Merkel’s 1882 account paints a horrifying picture:
“The surgeon sat on a box in the storeroom, lancet in hand, and around him were huddled as many as could be crowded into the confined space, old and young, children screaming, women crying; each with an arm bare and a woe-begone face, and all lamenting the day they turned their steps toward ‘the land of the free.’ The lymph used was of unknown origin, kept in capillary glass tubes, from whence it was blown into a cup into which the lancet was dipped. No pretence of cleaning the lancet was made; it drew blood in very many instances…”
A Century of “Criminal Experimentation” and Harm
As one might imagine, injecting “cultivated rottenness” of unknown origin into countless individuals had devastating consequences. Over a hundred doctors are documented as criticizing the practice, recognizing it as a vector for disease rather than a preventive measure.
One such critic, Dr. D. Albert Hiller, offered a scathing indictment in 1902 that remains powerful today.
“Jenner’s suggestion was acted upon by all who had a receptivity for such expedients, and those were generally negative practitioners; this suggestion culminating in a century of such radical vaccinating and experimenting on humanity that a truly healthy human being is the exception rather than the rule, in ‘civilized’ nations. That Jenner’s suggestion turned out to be but a hallucination, to say the least, this century of criminal experimentation on the human race has amply proved, for whatever variola has appeared in localities where sanitation and isolation of the patients has not been the rule, its effects have been as bad.
The microscope, with which the bacteriologist can plainly prove by ocular demonstration the rapid multiplication of bacilli, microbes, and whatnot, should prove an eye-opener to those who yet follow the pus infiltration doctrine of Jenner. If a hundred years of vaccination has not proved its efficacy, what will? The very process used upon a vaccine virus farm should convince every one that the rapidity with which they there multiply matter, in a calf or heifer, holds the same condition good when that virus is inoculated into the human body. When this cultivated rottenness is once grafted into the groundwork of human flesh and blood, promulgation of disease - germs is the inevitable consequence, for the process of involution takes care of it by natural law, so that this very vaccine-virus-rot will evolute into almost any form of disease in the body which, either through inheritance or debilitating environments, has for it a special attraction or a receptive condition. Erysipelas, septicemia, spinal meningitis, paralysis, amaurosis [partial or total blindness without visible change in the eye], amputation of limbs, diphtheria, tetanus, phthisis, and finally death, have too often been absolutely traced to this pernicious practice, and the fact can neither be ignored nor successfully denied.
We hear of pure virus, as being wholly innocuous, but there cannot be pure virus, for it is pure rottenness. The physician can only procure his vaccine-virus from the middle-man and can have no actual knowledge of its characteristics until after the damage is done to the patient, and then, in case of trouble, the physician has all that he can do to protect himself; in each abortive vaccination which proves fatal or even detrimental the facts are carefully suppressed or so skillfully distorted that the culprit vaccinator may save his reputation and evade legal responsibility, and the truth is so smothered that it seldom finds its way into public knowledge, and thus are “statistics” made or not made to suit the predominant practitioners and their allies-the vaccine virus manufacturers.”
The Vaccination Failure and the Sanitary Solution
Despite a century of aggressive vaccination, smallpox continued to ravage populations. A massive epidemic in 1871 provided damning evidence of the vaccine's failure. The Lancet reported:
“The deaths from smallpox have assumed the proportions of a plague. Over 10,000 lives have been sacrificed during the past year in England and Wales. In London, 5641 deaths have occurred since Christmas… Of 9,392 patients in the London Smallpox Hospitals, no less than 6,854 had been vaccinated, i.e., nearly 73 per cent. Taking the mortality at 17.5 per cent. of those attacked, and the deaths this year in the whole country at 10,000, it will follow that more than 122,000 vaccinated persons have suffered from smallpox!... Can we greatly wonder that the opponents of vaccination should point to such statistics as an evidence of the failure of the system?”
The decline of smallpox, when it finally came, followed the same pattern as all other infectious diseases: it coincided with broader societal improvements and, surprisingly (or perhaps not), as vaccination rates themselves were falling. By the early 20th century, astute observers recognized that sanitation, not vaccination, was the true conqueror.
“For forty years, corresponding roughly with the advent of the “sanitary era,” smallpox has gradually but steadily been leaving this country (England). For the past ten years the disease has ceased to have any appreciable effect upon our mortality statistics. For most of that period it has been entirely absent except for a few isolated outbreaks here and there. It is reasonable to believe that with the perfecting and more general adoption of modern methods of control and with improved sanitation (using the term in the widest sense) smallpox will be completely banished from this country as has been the case with plague, cholera, and typhus fever. Accompanying this decline in smallpox there has been a notable diminution during the past decade in the amount of infantile vaccination. This falling off in vaccination is steadily increasing and is becoming very widespread.”
The disease itself became markedly milder, often indistinguishable from chickenpox. As noted in a 1940 article in Public Health Reports, mortality had fallen by over 98% by 1939; smallpox had become a negligible threat long before the World Health Organization's eradication campaign began.
Shortly after World War II, in 1948, compulsory vaccination was finally ended. As noted by C. Killick Millard, MD, in the British Medical Journal, smallpox deaths had become extremely low despite a very low vaccination rate.
“...in Leicester during the 62 years since infant vaccination was abandoned there have been only 53 deaths from smallpox, and in the past 40 years only two deaths. Moreover, the experience in Leicester is confirmed, and strongly confirmed, by that of the whole country. Vaccination has been steadily declining ever since the “conscience clause” was introduced, until now nearly two-thirds of the children born are not vaccinated. Yet smallpox mortality has also declined until now quite negligible. In the fourteen years 1933-1946 there were only 28 deaths in a population of some 40 million, and among those 28 there was not a single death of an infant under 1 year of age.”
The Uncomfortable Conclusion: What Really Banished Smallpox?
The triumphant narrative of vaccination's victory over smallpox is a cornerstone of modern medicine. Yet, a thorough examination of the historical evidence reveals a far more profound and humbling truth: the decline and disappearance of smallpox followed the same inexorable pattern as every other infectious disease, pointing to a cause far greater than any single medical intervention.
This view is supported by experts who have studied the epidemiological data. In 2003, Dr. Thomas Mack, a renowned epidemiologist, directly challenged the herd immunity dogma, asserting that the virus was vanquished not by the needle but by progress itself.
“If people are worried about endemic smallpox, it disappeared from this country not because of our mass herd immunity. It disappeared because of our economic development. And that’s why it disappeared from Europe and many other countries, and it will not be sustained here, even if there were several importations, I’m sure. It’s not from universal vaccination.”
This expert testimony confirms the undeniable historical trajectory. Smallpox mortality began its permanent decline alongside improvements in sanitation, nutrition, and living standards—the same “unrecognized prophylactic factors” that tamed whooping cough, measles, and tuberculosis. Smallpox became milder and less frequent, eventually fading into obscurity as a natural cause of death before the global eradication campaign of the 1960s and 70s began.
Therefore, the story of smallpox is not a simple vindication of vaccination. It is a complex and often dark history of a brutal procedure that caused significant harm, failed to prevent major epidemics despite widespread use, and was ultimately rendered obsolete by the same forces that conquered all infectious diseases. The credit belongs not to a lancet or a vial, but to the profound power of improved sanitation, nutrition, and human living conditions. This conclusion does not merely revise history; it fundamentally reorients our understanding of what truly creates a healthy society.
Conclusion: Rethinking the Germ-Centric Notion of Disease
These mortality statistics and other information presented here lead to one inescapable and critical conclusion: the notion that a single microbe defines a disease is incorrect. The mortality rates for whooping cough, measles, and scarlet fever had already collapsed before medical interventions like vaccines were introduced. This proves that broader factors—such as public health and nutrition—are far more critical in determining survival than any pathogen itself.
Despite this evidence, our society wholeheartedly embraces a simplified model of disease that credits the microbe for causing illness yet overlooks the multifaceted factors that actually saved lives and reduced mortality by nearly 100%.
There is a lot more to the vaccine story… but this short introduction may make you rethink what you thought you were certain was the absolute truth.


