THINK THOSE CHEMICALS HAVE BEEN TESTED?


by DUANE LAW, L.Ac. | (310) 498-2777 

Is there any form of human life much lower than the litterer – the unconscious, the angry or both – who dumps their trash into areas we all use, forcing the rest of us to live in a world as ugly as the one they’ve learned to ignore?

Maybe.

As we clambered our way out of the Middle Ages and into today’s industrialized world … it seemed at first as if the soot that poured out of our smokestacks would forever dissipate into an infinite empty sky.

And for centuries it did.

But then … we built more smokestacks.

Lots more.

Billions.

In economics this is called “the tragedy of the commons.” Whether it’s dumping carbon into the air, toxic waste into the environment or marketing predatory financial investments to the naive … it seems a few tend to make off with the money and leave the rest of us holding the bag.

Many of us are feeling the effects of this in our own bodies these days. The evidence grows that exposure to pollution causes a subtle, chronic, low-level inflammation. Over time the effects add up.

We call it aging. Fatigue, pain, auto-immune disease, dementia, cancer … we now know that all of these are accelerated by chronic, low grade inflammation.

Widespread pollution … even extremely dilute pollution … is one of the most unavoidable causes.

In April 2013 The New York Times published an astonishing article, “Think Those Chemicals Have Been Tested?”

It made clear just how deluded any of us were if we thought that anyone was really looking out for our interests in the competition between corporate bottom lines and our bodies.

The governing federal law, the Toxic Substances Control Act, was passed by Congress and signed by Republican President Gerald Ford in 1976. It had not been substantially updated since.

The law showed the characteristic signs of corporate/government cynicism in action. The burden of showing a chemical to be safe wasn’t on the company that invented it, marketed it and profited from it. It was – and still is – on the federal government. And the system seemed exquisitely well-designed to create the impression that the public is protected while actually tying regulators’ hands quite effectively. Here’s The New York Times commenting on the state of the art then:

“Companies have to alert the Environmental Protection Agency before manufacturing or importing new chemicals. But then it is the E.P.A.’s job to review academic or industry data, or use computer modeling, to determine whether a new chemical poses risks. Companies are not required to provide any safety data when they notify the agency about a new chemical, and they rarely do it voluntarily, although the E.P.A. can later request data if it can show there is a potential risk. If the E.P.A. does not take steps to block the new chemical within 90 days or suspend review until a company provides any requested data, the chemical is by default given a green light.

The law puts federal authorities in a bind. ‘It’s the worst kind of Catch-22,’ said Dr. Richard Denison, senior scientist at the Environmental Defense Fund. ‘Under this law, the E.P.A. can’t even require testing to determine whether a risk exists without first showing a risk is likely.’”

“In its over forty-year history, the EPA has succeeded in banning only five substances1 and has tested only a small percentage of the over 85,000 chemicals that were in use in 2013.”

This regulatory regime has been in place for most of the lifetime of everyone reading this piece.

“A basic principle of modern state capitalism is that costs and risks are socialized to the extent possible, while profit is privatized.”
                                        – Noam Chomsky

It shouldn’t surprise any of us that toxic chemical regulation is the deepest part of the swamp. It’s a battle against simple entropy: it takes much less money and effort to dump a toxic chemical out into the environment than it does to gather it back in again and dispose of it safely. If that’s even really possible.

One can see this in the current speed with which deregulators are trashing hundreds of important safety regulations vs. the decades it took to put even the beginnings of an effective regulation regime in place.

Crucial problems also exist with the way researchers have gone about evaluating the effects of toxic chemicals in our bodies.

I’ll try to make this somewhat technical discussion as easy-to-digest as possible. Here goes:

  1. Old school LD-50 tests (the “gold standard” of toxin testing for decades) performed on lab animals are unreliable when used as a basis for determining “safe” chemical exposure levels.2,3 This results from many factors, not least among them the fact that different species can have their own unique detoxification pathways (or lack them). Aspirin kills cats. Raisins kill dogs.
  2. When toxins are tested, they’re typically tested one at a time. Yet studies have shown that the rich (and unpredictable) mixes of toxins we’re exposed to have effects many times greater than we’d predict by simply adding their individual toxic effects together. Toxins appear to act synergistically when they’re encountered in combination,4,5 as they always are outside the lab.
  3. Old school “the dose makes the poison” thinking assumed that extrapolating downward from an LD-50 dose by an arbitrary amount would produce a “safe” level of exposure. Research has now definitively established, however, that the dose-response relationship isn’t linear for many toxins: very small doses appear to have toxic effects far beyond what simple mathematical models would predict,6,7,8 and early work suggesting that low toxic exposures could even be beneficial when higher doses were toxic has now been shown to be questionable.9,10 (There’s a rough sketch of this just after the list.)
  4. Those same mathematical models define “statistically significant” data by ignoring those who have the strongest adverse responses (as well as those who have the least response). The danger inherent in this approach is that it washes out evidence of substantial subpopulations that are atypically vulnerable to a toxin. In the past these anomalies were dismissed as statistical noise. Today we understand that large numbers of humans carry genetics that make them unusually vulnerable to toxic exposures,11,12,13,14,15 even if the numbers of those severely affected look small when averaged over a random sample of the general population (see the second sketch after the list).
  5. Now view all these factors in light of the astronomical cost of mounting large-scale scientific trials and the dominance of direct or indirect corporate sponsorship of so much of today’s research … the “small world” nature of the field, and the awareness in the back of everyone’s mind that even careers that don’t currently depend on corporate funding could need it down the road … the challenges in getting crucial studies replicated … All of this intersects with the scientific community’s understandable tendency to give the most weight to the preponderance of evidence (while remaining largely unconcerned with how skewed that evidence is) to produce an evidence base in many areas – but especially in toxic safety analysis – that’s fundamentally unreliable.16,17
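To make point 3 a little more concrete, here’s a rough sketch in Python – with numbers I’ve made up purely for illustration, not taken from any of the studies cited above – comparing the old straight-line “fraction of the LD-50” extrapolation with a hypothetical non-monotonic dose-response curve of the kind the low-dose literature describes:

    # Illustrative sketch only: every number here is invented for the example.
    # It contrasts a linear "dose makes the poison" extrapolation with a
    # hypothetical non-monotonic dose-response curve.
    import math

    LD50 = 100.0  # hypothetical dose (arbitrary units) lethal to half the test animals

    def linear_model(dose):
        """Old-school assumption: the effect shrinks in simple proportion to the dose."""
        return 0.5 * dose / LD50  # 50% effect at the LD-50, a straight line below it

    def nonmonotonic_model(dose):
        """Hypothetical curve: a low-dose bump plus a sigmoidal rise toward the LD-50."""
        low_dose_bump = 0.25 * math.exp(-((dose - 2.0) ** 2) / 2.0)         # peak near dose = 2
        high_dose_rise = 0.5 / (1.0 + math.exp(-(dose - LD50 / 2) / 10.0))  # ~50% by the LD-50
        return low_dose_bump + high_dose_rise

    print(f"{'dose':>6} {'linear':>8} {'non-monotonic':>14}")
    for dose in [0.5, 1, 2, 5, 10, 50, 100]:
        print(f"{dose:>6} {linear_model(dose):>8.3f} {nonmonotonic_model(dose):>14.3f}")

Run it and the contrast jumps out: at a dose one-fiftieth of the LD-50, the straight-line model predicts an effect of about one percent, while this made-up non-monotonic curve predicts roughly twenty-five percent. Divide an LD-50 by a “safety factor” of fifty or a hundred and you can still land squarely on a low-dose bump like that one.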

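And here’s an equally rough sketch of point 4 – again, every number is a hypothetical assumption, not data – showing how averaging over a whole population can make a genetically vulnerable subgroup vanish into the noise:

    # Toy simulation: suppose 5% of a population carries a variant that makes them
    # ten times as responsive to an exposure. All of these numbers are invented.
    import random
    import statistics

    random.seed(42)

    POPULATION = 100_000
    VULNERABLE_FRACTION = 0.05   # hypothetical share carrying a sensitizing variant
    TYPICAL_EFFECT = 0.1         # average symptom-score increase in typical people
    VULNERABLE_EFFECT = 1.0      # ten times larger in the vulnerable subgroup
    NOISE_SD = 1.0               # everyday variation in the symptom score

    typical, vulnerable = [], []
    for _ in range(POPULATION):
        if random.random() < VULNERABLE_FRACTION:
            vulnerable.append(random.gauss(VULNERABLE_EFFECT, NOISE_SD))
        else:
            typical.append(random.gauss(TYPICAL_EFFECT, NOISE_SD))

    everyone = typical + vulnerable
    print(f"whole-population mean effect:    {statistics.mean(everyone):.3f}")
    print(f"typical subgroup mean effect:    {statistics.mean(typical):.3f}")
    print(f"vulnerable subgroup mean effect: {statistics.mean(vulnerable):.3f}")

Averaged over everyone, the exposure barely registers – a mean effect of roughly 0.15 against day-to-day noise of 1.0 – while the five percent who carry the variant show an average response ten times that of their neighbors. Trim or average away the tails and that subgroup simply disappears from the analysis.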
Here’s a prime example of the aggressive approach Big Ag can take to suppress research that threatens its bottom line:

There are broad political and social implications to the misuse of scientific data to bury the plight of vulnerable genetic subpopulations exposed to toxic substances generated by our industrialized society for the profit of a relative few at the top; let’s save those for another rant …

… Except to point out that there’s a profound karma to this approach to things. Those directing these poisonous enterprises (and lobbying successfully these days for a rollback of the regulations that protect the rest of us from their activities) are just as genetically vulnerable to background levels of environmental toxicity as the rest of us. And so are their kids. Maybe even more so.

Soon I’ll go into a bit more depth about just why … and how to protect ourselves from our increasingly toxic environment. Stay tuned.

 1. Polychlorinated biphenyls, dioxin, hexavalent chromium, asbestos and chlorofluorocarbons.

 2. Rowan A. “Shortcomings of LD50-values and acute toxicity testing in animals.” Acta Pharmacol Toxicol (Copenh). 1983;52 Suppl 2:52-64.

 3. Zbinden G, Flury-Roversi M. “Significance of the LD50-test for the toxicological evaluation of chemical substances.” Arch Toxicol. 1981 Apr;47(2):77-99.

 4. Hayes TB, Case P, et al. “Pesticide mixtures, endocrine disruption, and amphibian declines: are we underestimating the impact?” Environ Health Perspect. 2006 Apr;114 Suppl 1:40-50.

 5. Hernández AF, Parrón T, et al. “Toxic effects of pesticide mixtures at a molecular level: their relevance to human health.” Toxicology. 2013 May 10;307:136-45.

 6. Crépeaux G, Eidi H, et al. “Non-linear dose-response of aluminium hydroxide adjuvant particles: Selective low dose neurotoxicity.” Toxicology. 2017 Jan 15;375:48-57.

 7. Neumann HG. “Aromatic amines: mechanisms of carcinogenesis and implications for risk assessment.” Front Biosci (Landmark Ed). 2010 Jun 1;15:1119-30.

 8. Benford D, Bolger PM, et al. “Application of the Margin of Exposure (MOE) approach to substances in food that are genotoxic and carcinogenic.” Food Chem Toxicol. 2010 Jan;48 Suppl 1:S2-24.

 9. Thayer KA, Melnick R, Burns K, Davis D, Huff J. “Fundamental flaws of hormesis for public health decisions.” Environ Health Perspect. 2005 Oct;113(10):1271-6.

10. Thong HY, Maibach HI. “Hormesis [biological effects of low-level exposure (B.E.L.L.E.)] and dermatology.” Cutan Ocul Toxicol. 2007;26(4):329-41.

11. Marini NJ, Gin J, et al. “The prevalence of folate-remedial MTHFR enzyme variants in humans.” Proc Natl Acad Sci U S A. 2008 Jun 10;105(23):8055-60.

12. Kiyohara C. “Genetic polymorphism of enzymes involved in xenobiotic metabolism and the risk of colorectal cancer.” J Epidemiol. 2000 Sep;10(5):349-60.

13. Singh V, Parmar D, Singh MP. “Do single nucleotide polymorphisms in xenobiotic metabolizing genes determine breast cancer susceptibility and treatment outcomes?” Cancer Invest. 2008 Oct;26(8):769-83.

14. Gra OA, Glotov AS, et al. “Polymorphisms in xenobiotic-metabolizing genes and the risk of chronic lymphocytic leukemia and non-Hodgkin’s lymphoma in adult Russian patients.” Am J Hematol. 2008 Apr;83(4):279-87.

15. Longuemaux S, Deloménie C, et al. “Candidate genetic modifiers of individual susceptibility to renal cell carcinoma: a study of polymorphic human xenobiotic-metabolizing enzymes.” Cancer Res. 1999 Jun 15;59(12):2903-8.

16. Goldacre, Ben. Bad Science. 2008. New York: Faber and Faber, Inc.

17. Gøtzsche, P. Deadly Medicines and Organized Crime: How Big Pharma Has Corrupted Healthcare. 2013: London: Radcliffe Publishing.
