So are they increasing, or are we just figuring them out more and living longer and keeping more kids alive? I've got a spinal arthritis condition that didn't start until my mid-30s, and it's not a new condition, but the thinking around it has changed substantially.
Edit: the consultant I saw said I probably wouldn't get diagnosed 20 years ago when she started. I had trouble convincing my doctor it could be this condition. I'd never have known about it unless I'd mentioned it to my mum who told me my sister (who I don't have contact with) has this quite severely and it took her 10 years to get a diagnosis and access to helpful drugs.
This study was published in a journal that got caught publishing a questionable study on a major health issue a few years ago, one they later had to retract. This was despite its reputation as one of the best and peer-reviewed. So I don't blame you for questioning it.
But I'm a parent with children with issues. And everything we study from other very mainstream sources says their issues are on the rise. These are issues that you can't ignore or not recognize.
I think phrases like “a journal that got caught publishing a questionable study” imply a level of malice that isn’t appropriate. Journals publish studies all the time with some degree of peer review and a fraction of them are going to have problems, plus there is always the possibility of deliberate fraud that peer-review systems are explicitly not designed to catch. You shouldn’t believe every published study is 100% accurate just because it was published. But you also shouldn’t use phrases like “got caught” unless your goal is to imply deliberate malfeasance.
Journals operate in an information brokerage business, and they are compensated to ensure reasonable review of what they publish. What you wrote would be applicable to something like Reddit, but you can't accept funds for a job that you don't carry out.
The problem here is that journals are relying on an open source peer review process for which they pay nothing, and then trying to cash in on the fruits of that. Blaming the reviewers is like blaming the medieval serfs for the fact that their landlords sold you rotten grain.
Even if they were paid, it would not check for this. Peer review checks formalities, obvious methodological problems, citations, that sort of stuff. It is not an independent study meant to validate the data itself.
I've been doing some searches since I asked the question and I wouldn't say there's a satisfactory answer. A mainstream source just saying that autoimmune disorders are on the rise isn't good enough, especially as most articles I've read couch that with "and we now recognise more of these disorders" or something similar.
There seems more evidence that allergies are on the rise. Maybe. I note that I found a lot of articles saying things like "autoimmune conditions are a Western disease" and blaming modern lifestyle. But arthritis is as old as the hills.
I'm not at all denying that they are actually on the rise, but nothing really seems to tackle this in a way that answers all the questions - is it on the rise or do we just detect and categorise it now. Does it occur because people live longer? In the past did people have these conditions and not know, have to live with it or die? I know that for my back condition people in the past got it and their back bones fused together over time, nowadays people tend to get treated with anti-inflammatories and now biologicals plus exercise regimes to prevent it. As there is a wide range of intensity within each autoimmune disorder the majority of people who had a condition in the past might be unreported.
So if anyone can point to articles or papers that address all those things I'd be very interested.
That's a lot of tough questions. I don't think answers are lacking, but you will find anything and its exact opposite.
Who funds this and that research?
You will find that pharmaceutical companies, in their desire to show growing market opportunities to their backers and to grow, will make very sure to fund enough groups so that some of them come up with the conclusion: these issues are on the rise and don't appear to relate to environmental exposures, even adding that treatments are or will be made available to diagnosed patients.
You will then find another paper, likely funded by an NGO working on reducing the use of pesticides or other surely health-impacting industrial habits, claiming there is an undeniable correlation between the intensive use of chemicals in agriculture and the rise of autoimmune disease, with a lag of X years, whatever makes the numbers look legit.
I'm not characterizing scientific research in general as biased towards the backers' incentives, but there is a lot of noise in statistical discoveries, with scientists mostly interested in publishing something, since that's their financial imperative to stay in the game.
With all that being said, I will give my "opinion", which may answer part of your questions, though my conclusion wasn't reached by the standards of scientific method:
- we are ingesting and getting exposed to a wider range and higher quantities of toxic compounds, often synthetic.
- the immune system is a complex machine that tilts and can overreact when on alert, tends to have a good memory, and can trigger or retrigger oversensitivity, all for our survival's sake; it isn't de facto a disease to have the immune system pick the wrong fights.
- that immune system has been shown to not always discriminate properly, having probably evolved to be safe rather than sorry.
- we don't fully understand the immune system, maybe we barely do.
Is it possible that we diagnose as a disease what might simply be the expected consequence of long-term exposure to undesirable chemicals? That pharma has something to sell, so they have no problem marketing chemical therapies and strongly promoting those, rather than suggesting that addressing the causes might be a better option?
It wouldn't be the first time in history that physicians insisted on the known treatments for symptoms whose causes they didn't understand.
Again, take this with a pinch of salt; I am not advising against drug treatments altogether, but I took the opportunity to remind this thread that health care is also a business.
From a non-medical scientist who happens to have allergies and has skimmed through a few hundred papers on the subject.
The idea that rates of disease increase when screening increases is widely known. Any epidemiologist who tries to analyze data and model disease trends accounts for a number of factors that can impact measured prevalence: screening rate, screening accuracy, etc.
The number of people with a disease can stay static, but the number of diagnoses can increase dramatically with widespread screening and accurate tests.
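As a toy illustration of that point (numbers entirely made up by me, not from any study): hold true prevalence fixed and let screening coverage and test sensitivity improve, and the diagnosis count climbs on its own.

```python
# Toy model: true prevalence stays constant while screening improves.
# All figures below are invented for illustration only.
def diagnosed(population, true_prevalence, screening_rate, sensitivity):
    """Expected number of diagnoses, ignoring false positives."""
    return population * true_prevalence * screening_rate * sensitivity

POP = 1_000_000
PREV = 0.01  # 1% of people actually have the condition, in every era

# (era, fraction of population screened, test sensitivity)
eras = [("1990s", 0.10, 0.70), ("2000s", 0.30, 0.85), ("2010s", 0.60, 0.95)]
for era, rate, sens in eras:
    print(era, round(diagnosed(POP, PREV, rate, sens)))
# Diagnoses rise roughly eightfold (700 -> 5700) while the number of
# people who are actually sick never changes.
```

The point of the sketch is just that a trend line of diagnoses says nothing by itself; you have to adjust for how hard anyone was looking.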
Moreover, it's a vicious (or virtuous) cycle. To get better funding to research a specific disease, experts will play up how important their research is by playing up the severity, the "increasingness" of the thing.
With funding secured you can now roll out more research studies that tell practitioners all about the disease and how it's "increasingly important". Now doctors will test more patients for the disease, maybe overdiagnose it too. And so on and so forth.
Humans are flawed. Humans make research. We should be more skeptical about supposedly “established” science.
Given that fentanyl overdoses are now the leading cause of unintentional death for young people in the USA, if that trend continues then we should expect to see higher average levels of opioid resistance evolve over the next few generations.
I also expect to see hormonal birth control become less effective over many generations. Those for which birth control fails are likely to have more children.
Of course, all of us here will be long dead before any selection pressure exerted by those drugs becomes apparent at a population level.
No, that far overestimates the speed and impact of natural selection.
Having <1% of the population die before breeding because of some factor (whether drug overdoses or something else) is not a big enough pressure to nudge us towards opioid resistance in just a few generations. Even if there were some meaningful sub-population with such a resistance today, an effect as tiny as current overdose deaths might spread that resistance over hundreds or thousands of generations; it would take overdose deaths a few orders of magnitude more common to have an effect in just a few generations. Similarly for hormonal birth control: when it does fail, the resulting pregnancy has an above-average likelihood of being aborted.
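For a rough sense of those time scales, here's a back-of-envelope haploid selection model (my own sketch with invented numbers, not from the thread or any dataset): a hypothetical "resistance" allele at 1% frequency, where non-carriers pay a 1% per-generation fitness cost.

```python
# Standard one-locus haploid selection update (textbook formula).
# Numbers are invented purely to illustrate orders of magnitude.
def next_freq(p, s):
    """Allele frequency after one generation; non-carriers have fitness 1-s."""
    return p / (p + (1 - p) * (1 - s))

def freq_after(p, s, generations):
    for _ in range(generations):
        p = next_freq(p, s)
    return p

p0, s = 0.01, 0.01  # 1% starting frequency, 1% selective disadvantage
print(freq_after(p0, s, 5))    # a few generations: essentially unchanged
print(freq_after(p0, s, 500))  # hundreds of generations: a real shift
```

Under these assumptions the frequency barely moves over five generations but becomes a majority after a few hundred, which is roughly ten thousand years of human generations.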
I mean, all these things can happen, but only long after we're all dead, at the time scale of year 3000+, by which time I'd expect medical technology to let us control these genes directly, making the direction of natural selection irrelevant.
It's an amusing thought, but I don't understand how you can apply such a darwinian rulebook here unless you conflate behavioral traits with bio-molecular pathways. Do you think abstinence is part of opioid resistance, or that forgetting to take pills is part of hormone resistance?
Most people _not_ dying of fentanyl overdoses are not taking them in the first place, rather than surviving exposures due to sturdier nervous systems. I think a normal interpretation of the phrase "opioid resistance" would mean that their systems can tolerate a higher dosage.
Conversely, most birth control failures are thought to be behavioral differences. I.e. whether the user is aware that protection may be absent due to missed doses or other scenarios called out in the instructions.
Maybe these two forces will cancel out since one selects for more discipline and the other for less...
But that's exactly what it is. Over time, willingness to take the pills, and ability to follow through on intent to take the pills, will be bred out of the population.
pretty much - we don't adapt our form to our environment; rather, when a problem arises we develop a technology that removes the discomfort we feel, which would otherwise act as an adaptation-driving pressure. We develop our tech such that any segment can be readily learned as an isolated component, and as such our memory isn't under pressure to improve. We outsource memory to digital storage, and so there is even an allowance for memory to shrink.
Overall, all our tech looks at helping us not have to change, grow, or adapt. We want to live a life of comfort but comfort means no pressure and no pressure means no adaptation reinforcement of advantageous mutations, since what is most advantageous is being average in every way.
The notion that “we don't adapt”, though, is really a euphemism for “people, who otherwise might not, are surviving to reproduce and are therefore not selected out”. But the truth is that death is still inevitable and people are still being selected out, and we are still, therefore adapting. The only thing that has changed is what environment we are being selected for: Earth in the Anthropocene.
The technology-caused changes of environment are so recent that we wouldn't have adapted to them no matter what. Adaptation takes a long time; the last significant adaptation we've had is lactose tolerance (the lactase persistence mutation), which was unusually fast to spread and still took a few thousand years to do so. A few centuries is such a short time (for human generations; it's different for bacteria or insects) that we should expect literally zero adaptation even if the "pressure to improve" were the same.
Always evolving, but technology has surely influenced natural selection - a lot of illnesses/conditions that might have killed people prior to them reproducing are no longer factors in that regard
i mean yeah i knew a bunch of weak sickly kids growing up who wouldnt've survived natural selection. instead they will go on to breed more scrawny allergy ridden asthmatics
i wonder how this carries over to mental stuff. like autists or psychopaths might have gotten forced out of society a few hundred years ago so they couldn't reproduce as much, might explain why the former is so much more common? idk
There is absolutely no reason to think psychopaths were excluded from society. Chances are, they were very successful and ran even more unchecked than now.
Cultures had good psychopath tests then too, like this is where dares originate from. You could get excluded quickly. After religion and tech arrived I think on balance it's more comfortable for them.
Yes, we can't ignore these issues. I always challenge people to attempt all kinds of mitigations upstream before medicating downstream. This is also true for diabetes and autoimmune diseases. For example: stop buying food from factory farms, or fruits with tons of genetically engineered sugar, or foods with tons of high-fructose corn syrup or aspartame, and tell our government to tax negative externalities rather than pass "ag-gag" laws that criminalize exposing them. Your media will rarely talk about the role of actual sugar intake in diabetes, preferring to focus on "inherited" and "intrinsic" factors, and the same goes for many other medical issues the pharma industry will medicate downstream.
The government specifically works with industry to let corporations get away with saving a buck at the expense of the wider public, while the wider public is given medications and bandaids downstream that decades later turn out to be scams: https://magarshak.com/blog/?p=362
Once you see it, you can't unsee it. Like when the government allows Coca-Cola and Snapple bottling companies to switch to metric tons of non-biodegradable plastic, while individuals are told they can't have a plastic straw or bag and can save the world by recycling plastic (a scam by the plastic industry to distract from the upstream problem; nearly all of the "recycling" consisted of shipping it to China).
Or how factory farms are now causing even more negative externalities, which we only found out about last month:
I invite you to consider the same dynamic with autoimmune diseases and others. It’s the same question of whether kids always had this level of gender dysphoria and ADHD and autism, or did something happen in the environment, or did the industry start to overdiagnose them?
The truth is probably somewhere in the middle, but activists on either side insist on their explanation being the dominant one, or perhaps the only one. Similarly to how other activists might insist on systemic sexism or racism to explain a gender pay gap, or systemic antisemitism to explain criticism of Israel, etc. (Disclaimer: I’m a Jewish straight cis male.)
This is an impulse by people with a specific agenda / issue to make sure their agenda "wins". In every case, there are real problems that need to be solved upstream, but the focus by well-meaning activists is only on downstream solutions (eg prescribing amphetamines to ADHD-diagnosed kids) rather than upstream solutions (eg shorter school days and allowing kids to climb trees like in Finland, where ADHD is diagnosed far less). They will spend decades advocating the same thing and expect a different result. It's not collective insanity; it is actually learned helplessness.
It is common to think this way when one’s “tribe” is involved. To illustrate it, I will touch on an issue with dynamics that have affected many people on HN. I hope it’s alright to mention it, because that is the only way you’ll see that the “downstream focus” even concerns many sacred cows.
So, to hit home, this even happened on HN with depression. I have seen it for years, where any attempts to attribute some diagnosed depression to upstream causes like diet and exercise and environmental factors are met with "you just don't understand, clinical depression is completely different, and you should be quiet".
In each of these issues, the science shows that the theory that promotes medical interventions downstream is often wrong, eg in 2022 we found this out when it comes to the theory of chemical imbalance / low serotonin explaining depression:
But it's easy for society to just prescribe medications to people rather than fix systemic issues, eg implementing a UBI or 20-hour-workweek overtime protections so people can spend more time with their families. In capitalism we'd rather let companies exploit social problems with surveillance capitalism, social networks and AI optimizing for clicks, because that's what the market demands.
On the other hand, there are tons of studies and books on how the market and new technology have led to the breakdown of American social and family life, starting with "Bowling Alone" by Robert Putnam back in the early aughts. People in big cities don't even know their own neighbors. They live alone with their cats, order in Netflix and Uber Eats, they stick their kids in public schools and their parents in nursing homes, but hey, the economy is good, right?
Anyway, to end with autoimmune diseases… notice how much more effort it would take for our society to stop putting plastics in clothes, which break down into microplastics on every wash, aren't caught in drains, and now find themselves everywhere, so humans accumulate more and more microplastics. Rather than confront these environmental issues head-on (as we did once with CFCs and the Montreal Protocol, the one big win for solving collective action problems), we allow the microplastics to accumulate in all biological beings, including us, throw up our hands and wonder aloud what could be causing autoimmune diseases?

Meanwhile it is more convenient to open a new plastic spoon from a plastic package shipped from China than to wash a metal one. Instead of refilling bottles like our ancestors, we buy new ones every day and chuck them in the "recycling" LOL. Oh, and polyester is just so stretchy and convenient, as is the teflon non-stick coating on our pans, so why demand our government tell industry to phase out non-biodegradable stuff and research biodegradable alternatives? Nah, that's too hard to do as a society. So the problems get worse every year, and "activists" arise who attribute it to a systemic issue like sexism or racism. They think they're being great allies, but rather than fighting each other we should band together to demand our government aggressively tax negative externalities upstream with Pigouvian taxes.
Fiscal policy, btw, would be a great way to remove stimulus and UBI money from the economy. Certainly much more targeted and better than the crude monetary lever the Fed has, raising interest rates on everyone. But our politicians are scared to raise taxes on industries, or to go up against Big Ag or Big Pharma. So we as a society allow Day Zero to occur in cities, antibiotic-resistant bacteria to grow, etc., all because of regulatory capture and cowardice. It will continue unless people recognize that the real problems are upstream, stop fighting each other, and clearly spell out how we can organize to demand our government raise taxes on industries externalizing costs onto others. If there was ever a proper role for government, it is that!
PS: with AI, experts say there's a 10% chance that humanity is wiped out… yet we can't stop the competition and progress, and certainly when people are affected we'll have industries to help them out downstream. Just another example, perhaps the biggest and fastest one soon.
Researchers can intentionally induce depressive symptoms in mice by injecting lipopolysaccharides. This is somewhat speculative but it's possible that some cases of depression in humans could be caused by gut disorders that allow large amounts of LPS to pass into the bloodstream.
It's not even speculative. We know bad gut microbes dump LPS, and we can and have quantified increased bloodstream LPS levels in people with numerous issues, depression and chronic fatigue among them.
I looked into this a while back, and I believe there are a number of different autoimmune conditions that are definitely going up with no cause known (corroborated by multiple studies). One is linked
I also am a bit troubled by how quick people are to dismiss such studies. I think people are operating off of a prior of "big shifts in health don't just happen," which in my opinion is a demonstrably false prior. I think the default prior should be "we're in completely new health waters, testing in production; anything is possible, and there are known problems that aren't explained."
In particular nobody can explain why eyesight is getting drastically worse. Obesity in the developed world is a big unknown (I think even lab rats weigh more). I believe there's consistent evidence of allergies increasing. And the balance of evidence is that sperm counts are going down, but for whatever reason I think people assume that's a conspiracy theory or just are afraid to believe it.
What level of evidence do you require for those explanations? Because all three observations you mention have credible theories, some even have partial proofs.
nobody can explain why eyesight is getting drastically worse
The prevailing theory has to do with lack of exercise of the eyeball, i.e. children spend too much time indoors locked at a single viewing depth; children who spend more time in the outdoors tend to have better eyesight. In China they're running broad studies which seem to indicate that children's eyesight stops deteriorating when exposed to more bright daylight: https://clinicaltrials.gov/ct2/show/NCT05156190
Obesity in the developed world is a big unknown
Sugary/unbalanced foods and sedentary lifestyles? The proofs we have go as far as to document the entire systemic changes that ultimately lead to insulin resistance: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6471380/
sperm counts are going down
There are suspected links to endocrine-disrupting pesticides, but AFAIK no proven mechanisms.
Even if the two you offer theories for are correct, it doesn't really apply to my overall point which is: Big health shifts have happened in unforeseeable ways before and we have no reason to think there won't be any more of them.
Also wearing glasses when they're not needed. Your eyes will adapt to the accommodative stress presented. If you look at a screen all day, you'll end up with a power of roughly -2 assuming you don't wear any glasses looking at your screen, since your eyes have adapted to treat that 2 feet as optical infinity so it can remain in focus without any accommodation needed. If you then go ahead and wear prescribed glasses for close-work where they are not needed, you again restart the process.
Interestingly, it has been found that only the periphery matters for this process; that's how those fancy glasses like Hoya Miyosmart work, by intentionally defocusing the periphery.
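The rough -2 figure above follows from the thin-lens rule of thumb that focusing at a distance of d metres demands about 1/d diopters of accommodation; if the eye adapts so that the screen distance needs zero accommodation, distant vision is off by roughly that same power. A quick sketch (my own arithmetic, not from the comment):

```python
# Dioptric demand of a viewing distance, per the 1/d rule of thumb.
# The screen distance here is just the "2 feet" example from above.
FEET_TO_M = 0.3048

def dioptric_demand(distance_m):
    """Accommodation (in diopters) needed to focus at distance_m metres."""
    return 1.0 / distance_m

screen = 2 * FEET_TO_M                    # ~0.61 m
print(round(dioptric_demand(screen), 1))  # ~1.6 D, i.e. "roughly -2"
```

So a screen at two feet sits around 1.6 D of demand; calling the resulting myopic shift "roughly -2" is in the right ballpark, though the sign convention flips because a corrective lens compensates for the error.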
I believe the increase in allergies (at least some common ones) has been linked to lack of exposure. There was a study many years ago showing that kids raised in houses with dishwashers tended to have more allergies than houses without them. This is correlation, of course.. was it the dirtier dishes that exposed me to more stuff as a kid than my wife was exposed to? Was it something about detergents being used? Not sure, but anecdotally my wife and I certainly reflect this correlation. Similar to not growing up around cats — more likely to have a cat allergy. Don’t ask me where all the latex allergies popped up from. Is that still as big a deal these days or did it go away? I’ve heard of parents deliberately trying to avoid their kids having allergies to things (like peanut butter) by essentially embarking on an exposure regimen from a young age.
Spondylitis? I’m curious mostly because I received a similar story and diagnosis, but over the last 5-10 years have had multiple doctors second guess what may be affecting me.
Yes. Once my mum told me to get a test for the HLA-B27 gene, which was positive, my doctor referred me, while still being doubtful. Consultant didn't second guess at all, she looked over my x-ray and family history and said it was pretty clear. It might seem like bad news but it's such a massive relief knowing and having a path forward treatment wise as I get older.
I’ve had an AS diagnosis for over a decade now and it was definitely a relief. Just having a name for why I wasn’t “normal”. Why everyone around me was able to just do so much more than I could. That this was a real thing, I’m not somehow faking it, and there’s someone who can help.
I wish you luck and hope you find some physical relief as well!
Life expectancy after age 5 has not budged much over the centuries, though you’ll find most graphs begin at a trough circa 1900. Prior to that it was about the same as it is now in many countries.
So “we’re living longer” is not the answer; it probably has much more to do with the massive amounts of new chemicals and pollutants we’ve exposed ourselves to. No barriers to transport, along with unrestrained trade and frequent migration, also mean we’re exposed to way more pathogens than we evolved to encounter, so some of these conditions could also be blowback from legitimate immune responses.
https://ourworldindata.org/its-not-just-about-child-mortalit... has life expectancies for age ranges from 1841 and shows a huge difference for a 5 year old. If conditions get worse with age then the average going from 55 to 82 (from age 5) could hide a lot of data about these disorders.
> Analysis of the mid-Victorian period in the U.K. reveals that life expectancy at age 5 was as good or better than exists today, and the incidence of degenerative disease was 10% of ours.
You see this basically throughout history, except in very bad times like famines or major plagues, or where there are persistent dietary issues, as we introduced in the late 19th century, or early on in the switch to farming. Rural areas also tend to do better than urban ones.
> Measles alone meant you likely lost kids in that age range.
I get what you're saying, but Measles is definitely the wrong thing to put here. Most people over the age of 35 aren't going to get what you're saying.
To be fair, most of us also don't know the pain of an infected wound. Has sadly gone down the same route as so many folks thinking they have had the flu.
Otherwise healthy people rarely die from nicks. My kids and I regularly get cuts or miscellaneous wounds, and I’ve never needed antibiotics or to do anything besides wash them with water.
Measles is fairly low mortality in absolute terms. Even within living memory, getting the measles and the mumps was a normal childhood experience. The overall mortality is 1 or 2 per thousand and the disease is most dangerous in under 5s and over 20s. So before widespread vaccination, everyone was exposed before their 20s and we excluded under 5s from the count.
Yes, I live in a rural area and work outside a lot. I also get scratched to bleeding by animals fairly frequently. My kids are magnets for all kind of mysterious injuries. It is simply not common to get serious infections from everyday wounds for otherwise healthy people. Indeed, for most of human history, cities had higher mortality than the countryside due to disease. "Random cuts" are not nearly as dangerous as "lots of people."
“Deadly” can be true for measles in young children, but again, it’s important to be clear that the actual mortality is very low. It certainly contributed (with many other diseases) to lowering life expectancy for infants, but it was not even close to a death sentence; the vast majority survived. As I said, it was basically a rite of passage even for our grandparents’ generation.
Working outside and getting scratched a lot in modern clean clothes, sterile tools, and with refrigerated food is still very very different to not having those things.
Let's circle back and remember that my post was a study of birth, death, and medical records, to which the response was "that doesn't feel right."
The data suggests your feelings are incorrect, and that things were not as dangerous as you feel they were. I'm not even sure what you mean in this case at all: people still washed their clothes, garden tools are not any more sterile now than they were then, and refrigeration can be worked around by different dietary and food storage choices. (Ironically, the need for refrigeration partially came with societal advancement: fresh milk, for example, was traveling farther and farther in increasingly transactional circumstances, which meant it was more likely to carry food-borne diseases; when you go further back, nobody was waiting days to drink their fresh milk and thus didn't worry about it.)
I’ve no doubt that there was a statistical impact, but it is probably rendered invisible by such things as skyrocketing obesity, heart disease, and cancer rates caused by the modern environment and lifestyle. This is probably especially true when you consider the QoL of many older people today. This is also apparent when the cited article mentions that many fewer people died of heart disease, but they did die of heart damage caused by infections. They lived just as long; you might think antibiotics would keep us going even longer, but we tread water because now we simply die of heart disease at about the same age, or a little younger, instead.
Certainly, my post is not meant to be a refutation of the study. I can express why it feels wrong, still.
Most of the feels is from personal experience of almost losing kids and a spouse to infections. In the modern world. :) Odds are 90% they would be dead in the past.
Similar experience with grandparents. Though they did pass somewhat early.
I mean, sure, getting chicken pox was relatively common when I was a kid, and few people died of it. But a low percentage of all the population is still a large number of people.
Makes sense. In movies I see things like a dirty arrow or a sword being guaranteed death from infection. Assuming the movie portrayal is accurate, what is the differentiating thing that causes a dirty wound from a weapon to kill, but not a dirty nick?
Well, movie portrayals of anything are generally not accurate at all.
I would say that infections from battle wounds are probably more likely, but that probably has more to do with the surface area and size of the wound and health leading up to the battle. A big scratch from a briar or a chicken might bleed a lot, but the wound will be closed fairly quickly. A big scratch from a sword is going to take a lot longer to close, and you might have been near-starving and short on sleep for weeks beforehand and have been crammed in tight conditions with 10k other guys.
That said, it's still greatly exaggerated to say a war-wound is guaranteed death. People got hurt in battle a lot and generally made it out OK. You can find lots of famous examples where even to me it's quite shocking they were able to survive at all without modern medical care. For example, not only was the hard-partying Alexander the Great wounded about a jillion times in battle, one of his last battles involved a punctured lung. It probably eventually contributed to his death sometime later, but it's surprising to me he even lasted the night!
This is tricky. With enough people, yes, a number will survive battle wounds. Most people died, though.
Consider, 50 to 70 million deaths in WW2. And that isn't looking at more genocidal like methods earlier humanity almost certainly used. Numbers would be lower, but percentage higher.
Yes, we are resilient, but that has limits. And is mostly overcome by having more kids.
This is just not true, dude. People died at all ages from all kinds of curable issues.
Mortality of women during childbirth is massively lower. That is the most obviously apparent one. Whole classes of diseases from malnutrition are down, don't exist anymore, or are fixable (anemia, iodine deficiency, that sort of thing). Tuberculosis was frequent enough that there were sanatoriums specializing in it. Leprosy.
When you read the history of epidemics, of food, and generally of medicine, you find a lot of premature deaths that just don't happen now.
> People died at all ages from all kinds of curable issues.
Sure, absolutely. Just not in large enough numbers to drag the numbers down lower than today [after age 5], when we've dramatically increased the number of deaths due to preventable issues. For example, heart disease was much rarer, and cancer incidence was less than 10% of today's numbers. Fewer people died in car accidents, and fewer were shot by AR15s. The causes shifted around. We made medical progress, but we took steps back as well.
As I also mentioned, there've been peaks and troughs. As the paper I cited mentions, health became so bad after the mid-1800s peak, by the late 1800s and early 1900s, the British army had to lower its height requirements to absurd levels because of malnutrition - malnutrition that was a direct consequence of technological advancement!
The hygiene hypothesis is pretty well proven now, as far as I know. It's the idea that early childhood exposure to various germs is important to tune the immune system correctly. It at least partially explains the rise of autoimmunity. From the wiki:
The rise of autoimmune diseases and acute lymphoblastic leukemia in young people in the developed world was linked to the hygiene hypothesis.
I wonder how our global lifestyle affects this too. You might have had a really dirty childhood in the UK and adapted to all the dirt and toxins there, but then you take a job in Nevada or something and you don’t have childhood immunity to anything there.
I’d love a study done on autoimmune disease prominence in expats vs local populations.
I know one study found celiac is increasing, not just being diagnosed more (although it is also being diagnosed more). I'm not sure of other autoimmune diseases.
(Increased diagnostic capability seems unlikely to be a factor in T1D's prevalence -- when it is triggered, untreated T1D leads to obvious and serious symptoms (eventually death through ketoacidosis) over the course of months. It's not something you can just ignore and power through; treatment is required to survive.)
I have Osgood-Schlatter syndrome, basically a bony growth on the knee. I'd had it for decades, but it wasn't diagnosed until my 30s, during a physical for a new job. Zero impact. No pain. No limitations. No issues whatsoever. But I do have a named syndrome that, in years past, would never have been noticed.
I recall somewhere that due to us getting better at protecting people from pathogens, immune systems get “less practice” at identifying friend from foe, increasing the likelihood of contracting an autoimmune disorder down the line.
I have no idea how accurate that is, but it’s what I overheard.
Personal anecdote: I have Graves disease, and it didn't flare up until a couple years after I finished immune therapy for a variety of severe allergies. The allergies are much better! But my immune system got bored, or something. Correlation vs causation though... causation makes a lot of sense when living it, but it's not proof.
I'm also quite certain my great-grandmother had this same thing, but no one would have been able to diagnose her as anything but excessively thin back then. (It severely increases your metabolism and can cause osteoporosis.)
It could be true for some of them, but it's also possible that "autoimmune" is not a very good description of what's happening for some others.
Disorders lacking an obvious cause seem to get lumped into the autoimmune category whenever the symptoms are related to inflammation. Before modern medicine, these disorders might have been masked by or passed as a milder form of tuberculosis or something else. It's still possible that there are pathogens involved that haven't been identified.
this is the "hygiene hypothesis" and there is a correlation but nobody has shown causation. it was popular 10 years ago but other models have become more prominent
I think this is a true increase, but I don't have anything to back that up. One major reason it might be increasing is our increasingly sterile world. Less exposure to nature, animals, etc., especially as children, might be preventing proper immune function. I also wonder about chemical exposures: food additives, vaccine adjuvants, manufacturing chemicals, etc. Although I think many of those suppress immune function rather than increase it. It is interesting that we don't know how some of these work, but use them anyway. For example, we don't know the mechanisms by which vaccine adjuvants work. There's a theory about molecular size, and we know they increase immune response. There's a lot we don't know about the immune system in general.
You mean comparing before and after COVID? Longitudinal studies are better, but those take more time to produce, so these kinds of studies come first. Also, almost everyone has had COVID now, so it's even harder to do that.
It's a lil crazy that people are so willing to say, well, we let it rip, and because of that we can ignore any problematic findings, since we can't compare current populations easily and across long periods of time. The most likely hypothesis, based on known mechanisms and studies, is that yes, COVID infection will fuck you up. This is the case with most human pathogenic viruses, even ones that have a "mild" initial presentation like EBV (multiple sclerosis), CMV (wears down the immune system in old age), chickenpox (shingles), HPV (cancer), or HIV (immune dysfunction).
The fun thing is, COVID sometimes has an acute life-threatening aspect as well, so it is probably double the fun.
No, I just mean that the study observed an increase in incidence of autoimmune disorders which cannot be explained by COVID because of the window of collection.
> Over the study period, age and sex standardised incidence rates of any autoimmune diseases increased (IRR 2017–19 vs 2000–02 1·04 [95% CI 1·00–1·09]).
I completely agree with COVID both being generally dangerous even outside of the acute infection and that it has likely caused a substantial increase in autoimmune disorders. This study just indicates that there are likely other environmental factors at play as well that predate COVID.
"Method A cohort was selected from German routine health care data covering 38.9 million individuals. Based on documented diagnoses, we identified individuals with polymerase chain reaction (PCR)-confirmed COVID-19 through December 31, 2020. Patients were matched 1:3 to control patients without COVID-19. Both groups were followed up until June 30, 2021. We used the four quarters preceding the index date until the end of follow-up to analyze the onset of autoimmune diseases during the post-acute period. Incidence rates (IR) per 1000 person-years were calculated for each outcome and patient group. Poisson models were deployed to estimate the incidence rate ratios (IRRs) of developing an autoimmune disease conditional on a preceding diagnosis of COVID-19."
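For anyone unfamiliar with the quoted methods, here's a minimal sketch of the arithmetic behind an incidence rate and an incidence rate ratio (IRR). All numbers below are made up for illustration; they are not taken from the study, and the real analysis used matched cohorts and Poisson regression rather than this raw ratio.

```python
# Incidence rate (IR): new cases divided by person-years of follow-up,
# conventionally scaled per 1000 person-years.
def incidence_rate_per_1000(cases: int, person_years: float) -> float:
    return 1000 * cases / person_years

# Hypothetical counts for a COVID-exposed group and a matched control group.
ir_covid = incidence_rate_per_1000(cases=450, person_years=30_000)     # 15.0
ir_control = incidence_rate_per_1000(cases=1080, person_years=90_000)  # 12.0

# The IRR is the ratio of the two rates: 1.25 here means 25% higher
# incidence in the exposed group than in the controls.
irr = ir_covid / ir_control
print(f"IRR = {irr:.2f}")  # IRR = 1.25
```

An IRR of 1.0 would mean no difference between the groups, which is why the confidence interval quoted upthread (1.00 to 1.09) is right on the edge of significance.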
Sorry I missed this - I was referring to the study in the OP, not the study you linked. I agree that the study you linked indicates a higher incidence of autoimmune conditions among those who had covid.
Not sure, but the difference between a measured amount of bound spike protein and an uncontrolled amount of fully functional spike + other evil proteins like those derived from orf8 and orf6 means I fully expect the virus to be the main problem here.
I don’t know if you know this, but the vaccine is not a measured amount of bound spike protein at all. It’s mRNA that gets translated over and over; each mRNA molecule creates many spike proteins. Exactly how many depends on many factors in your body.
When I’m at a PC again I will link you to some studies showing cases where it exceeds the amount you’d expect in a natural infection and there’s documented cases where it continues weeks after injection unlike the virus which doesn’t last that long in healthy young people.
That's fair, but it's a fixed amount of RNA and RNA degrades. The virus can continually create new RNA. COVID has been shown to persist in a fraction of patients, so it will continue for months or years. I don't see how these situations are comparable. Again, the vaccine spike is edited to be bound in the pre-fusion conformation whereas the natural spike is flexible and can bind to ACE2 and expose additional internal inflammatory sites.
"Young, healthy people" makes enormous assumptions about a tiny slice of the population. After one or more COVID infections, they may acquire the dreaded "co-morbidities" that a huge fraction if not a majority or supermajority of the population have. Despite how we like to think we have personal control over our circumstances, in this situation, we do not. It is a collective effort and we are failing to control transmission.
Vaccines are part of this, as are masking with N95 respirators, testing, air cleaning, paid sick leave, scientific isolation and quarantine periods, etc. I am hopeful that with targeted or universal air cleaning combined with partial population immunity (maintained via periodic vaccine updates) and paid leave, we can push the reproduction number below one. This is a very serious crisis that is worsening in some ways, but we are acting like the physical environment has no bearing on our lives if we think positively.
Anyone know about this idea that low-fiber diets make it easier for bacteria to get through the intestinal lining? And that these bacteria are a likely cause of autoimmune issues?
I read about this about 10 years ago in the NYTimes and haven't seen it in print since. But the last 3 doctors I've seen (in different cities), I've asked them about this and the responses have ranged from "research is showing this is likely the case" and then another one responded with a flat out "yes, that's correct." Yet I don't see anyone actually talking about this.
Very interesting. The increase in Coeliac disease will mostly be due to the advent of a useful blood test to detect it. The lack of association between other autoimmune diseases and multiple sclerosis fits with the recent identification of Epstein-Barr virus as a probable causative agent. The reduction in pernicious anaemia could be explained by reductions in the prevalence of Helicobacter pylori (due to effective treatment and improvements in food handling).
There's another confounding factor regarding celiac diagnosis, though its impact may be negligible: the lack of unique treatment options.
I know someone who learned through an elimination diet that they are sensitive to gluten. Maybe they actually have celiac disease, but they currently have no reason to investigate further -- the only practical "treatment" is avoiding gluten, which they are already now doing.
Furthermore, apparently the blood test for celiac disease requires the patient be exposed to gluten for some time before the blood is drawn. So they'd have to start eating gluten again (and thus suffering) to learn whether what they have is celiac disease.
The foods we eat today have higher amounts of omega-6. A healthy omega-6 to omega-3 balance is a ratio of around 3:1 or 2:1. When omega-6 intake is high, it seems to correlate with autoimmune disease and allergic diseases.
> The notion that modern culinary oils containing high amounts of PUFA, particularly LA, are inflammatory is an idea that has managed to permeate virtually every diet camp imaginable. The paleo/ancestral nuts believe it. The vegan/plant-based nuts believe it. The keto/carnivore nuts believe it. Zealots belonging to Ray Peat's diet cult definitely believe it too. But is there actually evidence for this claim from human experiments?
Intrigued?
Make sure you’re not just regurgitating things to be true that you’ve merely heard repeated a thousand times online. Because that got me too not long ago.
The anti seed oil sophistry is one of the most ubiquitous nutrition misconceptions today. On social media, at least.
I don't buy any specific anti-seed-oil claims, but it generally seems like a bad idea to dramatically change where people get a high percentage of their calories, using a food source that has in some cases (canola) never before been consumed by humans and in others was consumed only in much smaller quantities. The safest bet is probably to eat a diet as closely resembling those of your healthy ancestors as reasonably possible. Human biology is so insanely complex that we're simply never, in our lifetimes, going to be able to intelligently meddle here.
That we evolved with a food doesn’t mean it’s better for us. There’s an antagonistic pleiotropy effect where our genes make trade-offs favoring our fertile years at the incidental expense of our long-term health, and the same trade-offs apply to so-called “ancestral” foods.
I also don’t see how a naturalistic fallacy is more convincing than human health outcome data. If novel foods like canola oil are worse than adapted foods like saturated fats, how come health outcomes improve when you swap saturated fat with canola oil?
In other words, if our best nutrition intervention data doesn’t meet your standard of evidence, then what is the alternative that does? Hand waving over some cozy claims about ancestral foods, evidence be damned?
Specific studies on human health outcomes for different diets show wildly varying things. I’ve seen studies that show terrible effects and studies that show positive effects for canola oil. I also know there’s a huge reproducibility crisis and that I can only critique these studies so far. You say the “best data” shows you’re right, and I’ve seen people with a lot of credentials and other studies that say they’re right that it’s bad.
But what I can do is look at the data we have for overall current human health outcomes, which are absolutely catastrophic considering the advances in medical technology. Clearly we’re eating something we shouldn’t, being exposed to something we shouldn’t, and/or doing things we shouldn’t, on a mass scale, beginning relatively recently.
This seems like as decent a culprit as any. The fact that some shmuck in the 1870s London cesspool had a better life expectancy than me is disturbing. We’re clearly doing something massively wrong. I can’t control for pollutants and I get as much activity as I can, so diet’s the only other thing I can try to control. When we have bad outcomes and multiple changes at once, all I know to do is try to roll back the changes I have control over.
You also have to be careful about how you determine what a "negative" health outcome is. A lot of leaps are taken in the conclusions of papers about various studies.
Thanks for sharing. This is an intriguing article and I'm very impressed with the depth of research, site design, and logical debate quality of the several paragraphs I've read (so far). I've been looking for great takedowns of the anti-PUFA advice (I love fatty fish and omega 3s!) so looking forward to reading the rest.
It's worth noting that this site appears to be the same type of "extreme diet camp" that the author pokes fun at. He appears to be a vegan (https://twitter.com/TheNutrivore) and advocate for pro-vegan views. Overall he seems to have the same epistemology MO as, eg., Ray Peat -- a cranky obsessive lay-person who primarily blogs at length for topics that interest him. I picked another article off his site at random and his own biases become quite apparent (https://www.the-nutrivore.com/post/a-systematic-appraisal-of...).
The above preamble has no bearing on the PUFA question! I personally love cranky obsessive bloggers and genuinely look forward to reading more from this site. Open and honest debate, ftw.
I don’t understand. The blog author’s epistemological standard is set at the balance of evidence that we have available to us. I don’t think you can do better than that, else you’re left with sophistry, convenient falsehoods, and zealotry.
Who cares which “camp” you reason yourself into if you actually reasoned yourself into it using real arguments and scrutinized syllogisms?
Most people don’t do anything like that. We usually just defend our creature comforts or we, for example, decide what we want to eat and then work backwards to cling on to anything that seems like it might validate us.
Meanwhile, Have you read Ray Peat? He’s the opposite of someone reasoning about the balance of evidence which is why he spends most of his time talking about the bottom of the hierarchy of evidence; anecdotes and rat studies.
It's not just seed oils, the same accusations are made about the "modern" diet in general regarding ratios of various fats. The thing about seed oil in particular seems to be a newer fad.
That study says that the risk factor is red meat consumption, not seed oils:
> The typical American and Northern European diet is high in omega-6 FAs, with the average person having an omega-6 to omega-3 FA intake ratio between 15:1 and 18:1. These populations tend to overconsume red meat and underconsume unprocessed oils and cold-water fish.
You can google what foods have high omega 6:3 ratio - plant seed oils are just one culprit of many. Livestock is largely fed corn derived feed so that’s why our meat is high in omega 6 content.
So, stick to canola oil over olive oil if you want lower omega 6 ratios. Stick to unsaturated fats over sat fats.
People who fixate on omega-6 and seed oils usually dismiss the downsides of saturated fat, as a rule. They will go on about the dangers of seed oil based on something that isn’t scientific consensus, yet the overwhelming balance of evidence against saturated fat doesn’t meet their epistemic standard.
The stronger your immune system, the higher your chances of survival. Until your immune system is so strong it attacks its own host.
Evolutionary pressures will make sure that organisms sit very close to the edge between too weak and too strong.
One naive way of interpreting this is that 50% of the population will have an immune response that is too strong (above the edge) and 50% too weak (below it). But that is naive, as this topic has far more complexity than the one-dimensional model I have described here.
It would be interesting to know where, in that distribution, the statistical threshold for what we call autoimmune disease sits.
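The one-dimensional model above can be sketched as a toy calculation, assuming (purely for illustration) that immune response strength is normally distributed and that "autoimmune" means exceeding some fixed threshold. Both the distribution and the thresholds here are invented for the sketch:

```python
from statistics import NormalDist

# Hypothetical population: immune response strength in arbitrary
# standardized units, normally distributed around the mean.
strength = NormalDist(mu=0.0, sigma=1.0)

def fraction_autoimmune(threshold: float) -> float:
    """Fraction of the population whose response exceeds the threshold."""
    return 1.0 - strength.cdf(threshold)

# Threshold at the mean gives the naive 50/50 split from the comment above.
print(fraction_autoimmune(0.0))  # 0.5

# A threshold two standard deviations out affects only ~2.3% of people,
# which looks much more like real disease prevalence figures.
print(round(fraction_autoimmune(2.0), 4))
```

The point of the sketch is just that the "50% above the edge" reading only holds if the threshold sits exactly at the population mean; move it outward and the affected fraction shrinks fast.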
I don't think you understand a lot of what you write. Microplastics are largely inert, and most of them are much, much bigger than your immune cells. If those cells could somehow 'attack' a particle, they wouldn't be able to consume it (and if they could, nothing would change; just like ink in tattoos, it remains in roughly the same place even after immune cells consume it).
What would happen during an 'attack' is rather that a clot forms, which is not very good in, e.g., your bloodstream. Like pearls forming randomly in your body, but with a piece of plastic as the seed instead of a grain of sand, and just your dead immune cells around it. The chance of getting rid of it becomes much smaller. Now why would you want something like that?
That study was specifically on Australian firemen IIRC. Firemen are exposed to more of those particular chemicals because of their widespread use in flame retardant sprays.
As someone on similar medication for health reasons (severe testosterone deficiency), I can provide some more specific information that might be helpful.
The issue with testosterone is that it's linked to an increase in hematocrit (level of red blood cells in blood). There's a band of hematocrit levels that's healthy for donation. If the hematocrit level is too high, it's potentially dangerous to the recipient, but too low a level of hematocrit makes the donation dangerous to the donor. As a result, hematocrit level testing is one of the few tests that they do before every single donation.
These chemicals aren't dangerous; they're present in all humans. An absence of blood clotting is hemophilia, which is also a dangerous condition. Regular blood donation can prevent build-up, and the decrease in hematocrit after blood donation is one of the reasons for the required interval between donations.
I'm sorry to hear that you're dealing with that. If you're comfortable doing so, I wonder if you could share more details about how increased testing is involved? Thank you.
Not OP, but generally if you increase testing, you find more instances of disease, especially for diseases with lower/hard to gauge impact (ones that people may not have gone to the doctor for in the past).
Also not OP, but I recently got diagnosed with (another) autoimmune disorder solely because a single doctor twisted my arm into getting some unpleasant tests done. Turns out I have Crohn's disease and not just an unusually capricious stomach. (Also, thankfully, turns out Crohn's symptoms get significantly better with treatment.)
But if I hadn't been willing to power through the time and money and discomfort of getting tested, I probably would've remained undiagnosed for years. I'm thankful we caught it now and not before things got bad enough to land me in the hospital.
_Anecdotally_ I’ve noticed a huge uptick in colleagues and friends with some kind of autoimmune problem in the last 3 years, especially those with severe symptoms. Looking forward to any good study into this problem which was already growing even before the current “covid age”.
I would be interested in seeing some kind of long-term study of the appearance of reported symptoms that are associated with autoimmune disease, and not just the diagnosis of autoimmune disease. If rates of reported symptoms in the population are increasing, then it's suggestive that the prevalence of disease is increasing, whereas if diagnoses are increasing but reported symptoms are not, it's indicative of better testing and a stable prevalence of disease.
Severe symptoms would be painful and persistent Rheumatoid arthritis, MS, Myositis, and some condition requiring regular infusions of immunosuppressant drugs in hospital (I did not wish to pry as to the exact nature of that one). It’s my understanding that all of these were quite easily diagnosed in their respective patients as they were so severe.
Mostly this is because we have more reliable tests for autoimmune diseases, and more are being discovered. I wouldn’t be quick to jump to boogeymen like crystals, COVID, fluorine, or magical wind chimes as the cause.
I have a feeling that microplastics, PFAS, chemical contamination, digestive problems due to modern diets, and the like are all significant precursors to autoimmune disorders.
Wouldn’t that mean that infections are also a major source, since more people are infected by viruses than are vaccinated against them? Most vaccines, excluding the new mRNA vaccines, are just versions of the original virus, dead or weakened.
No, because the argument for vaccines causing autoimmune disorders is largely not about live viral vaccines (or at least not about them on their own) but rather about vaccines that contain adjuvants (and not all do). The idea is that the adjuvants result in the immune system targeting many proteins (not just the ones in the vaccine).
My (pharmacy student) daughter has a plausible theory about that.
Most (all?) vaccines have something in them to make the immune system recognize that there's something there to fight. (It might be called an "adjuvant" or something like that; I'm not the pharmacy student, so I'm probably getting the word wrong.) It's included in the vaccine that you get from the manufacturer.
That was fine, when you only had a very few vaccines to get. But now kids are supposed to get something like 18 different vaccines. Well, doctors know that the mom isn't going to bring their kid in 18 different times to get vaccinated. So they give the kid five different vaccines at a visit.
So the kid gets five doses of the adjuvant (or whatever it's called) at once.
For some kids, that's enough to make their immune system hyperactive. And behold, they develop autoimmune diseases.
As I said, this is a plausible theory. It's not proven. But it makes some sense, and it explains what changed.
OP’s comment is “just asking questions” couched in a folksy, disarming manner, intended to seed doubt about adjuvants when the word they were feigning ignorance of is “antibodies.” This is slightly more sophisticated than normal anti-vax comments, but is too cute by half.
I too have had to learn about adjuvants recently, straight up because of a new (new to me, anyway) wave of anti-vax sentiment specifically framing adjuvants as sinister. The last six dozen ingredients/compounds deemed threatening have not found widespread traction; this is the latest attempt.
And solar-generated electricity still carries the dangerous UV radiation. That's why skincare companies are the biggest sponsors of the solar energy revolution.
This is as disingenuous as OP is speculative. There are plenty of studies in non-bullshit journals demonstrating that millimeter-wavelength electromagnetic fields can affect voltage-gated calcium channels in the body. It seems highly unlikely that increased long-term exposure above natural baseline would cause autoimmune disease, but it also seems unlikely that it would have no other effect on the body.