It's worth noting that despite not providing Vitamin D, there's still a fair bit of value in getting sunlight through glass:
1. Seasonal Affective Disorder is treated using just the visible light spectrum entering the retinas of the eyes [1], and there are a ton of studies showing its effectiveness (in fact, most SAD lamps explicitly filter out ultraviolet light).
2. Some infrared passes through some glass and seems to be good for a thousand different things. [2] [3]
3. UVA passes through glass and has beneficial effects on the cardiovascular system. [4]
There is a lot of value in being able to see the outside. I can barely see outside from my position in the building and I feel it's really starting to eat my soul staring at walls the whole day.
Have you tried plants? I can't see windows from my desk, but I've got my cubicle filled with plants & it kind of makes up for it. One is a tropical shrub behind my monitors which would swallow my whole workstation if I didn't prune it back. Now if only people who seem to like the gloom would stop disconnecting bulbs in the fluorescent light fixtures..
There is evidence[1] that tropical plants and microorganisms in the soil will remove VOCs from sealed indoor environments, although the effect will likely be limited in an open plan office. At least they are nice to look at.
Oh... If only I could keep a plant alive in my office. I struggled with one for months and got it really healthy. I left instructions with a co-worker and went to Europe for a month. While I was gone, at least 5 people walked by that plant and said "oh, he's gone to Europe, I'd better water his plant for him". I get back and my plant is dead. My co-worker feels really, really bad about it and tells me he followed my instructions to the letter. Over the course of the next 2 weeks, the other 4 people excitedly told me that they had watered my plant for me while I was gone, even though there was a little sign in it saying "please don't water me".
UVA is what causes skin cancers. Recommending UVA exposure, or dismissing its risks, is irresponsible, especially in places with tons of UV radiation. Dermatologists go to extra lengths to teach people to choose sunscreen with good UVA protection; meanwhile Applied Science (one of my favourite youtubers) says "UVA penetrates regular window glass; hey, it's not to be worried about"...
Seems there's actually some debate on the net benefit/risk of this exposure [0]. Yes it raises risks for certain skin cancers but the survival rate of these is very high, meanwhile exposure decreases a collection of other risks whose death rates are much higher.
So I think the right course of action is a little more nuanced than simply saying, "UV increases skin cancer risk therefore ignore everything else as well as magnitudes of net risks/benefits". Should we not weigh all factors?
Actually, there was an article posted here (I think) not too long ago about a study that showed the positive benefits from sunlight were enough to outweigh the slightly increased chances of skin cancer.
Yeah, I think it really varies by region. Sure, it's good to go outside in Sweden and England, but not in Australia and NZ, where you can get sunburnt in about 15 minutes.
I've had a couple of barbecue parties this summer in Auckland and NOBODY went outside until the sun was well past 5PM. We preferred to sit in a stuffy kitchen rather than enjoy the beautiful outdoors. Even in the middle of winter you can feel the sun bite you as you stand at a pedestrian crossing.
The sooner we completely phase out CFCs and other stratospheric ozone depleting gases, the sooner we can go back to the good old days where you could enjoy that sunshine.
The prediction is that by 2050-2070 we will be back to the 1980 level of ozone thickness.
Scientists have spent a couple of decades now trying to combat the perception that ozone depletion is to blame for the high prevalence of skin cancer in Australia - it will never be a good idea to 'enjoy that sunshine' the way people used to.
(2001 paper - https://pdfs.semanticscholar.org/6944/503469e4f779e99884e144...
"Sufferers of skin cancer today should more likely blame their affliction on skin type and sun exposure during their youth than any changes in ozone distributions over the last twenty years. Therefore it is safe to say that even without ozone depletion, Australia would still have a very high rate of skin cancer.")
Can't agree more. It was pretty crazy in the 70s/80s.
You had people literally baking themselves in the sun with tanning oils. Even after "Slip Slop Slap" was first promoted in the early 80s everyone was still trying to get a tan.
Over Antarctica, yes. But in some conditions (the right winds? I forget) thinner patches of ozone do blow over the south coast of Australia, and they do increase UV exposure.
> Seasonal affective disorder is a mood disorder subset in which people who have normal mental health throughout most of the year exhibit depressive symptoms at the same time each year, most commonly in the winter.
I usually get depressed in the summer though, even when I get enough unfiltered sunlight.
There are all kind of things that happen seasonally that might dampen your mood (for instance people taking vacations, students being home from school, work being more/less busy, outdoor exercise options changing, etc). Your depression might not have anything to do with that and just be a random variation that happened to occur in the summer.
Most depression has nothing to do with SAD; it's merely a specific type. It's a type that has a fairly straightforward treatment, thankfully!
I often have a depressed mood in November and December when the sun is at its lowest, but my family also likes to make "the holidays" as miserable an experience as possible, so it's really no surprise and probably unrelated to light levels.
FWIW, I have the same symptoms in winter, but none of the reasons you list apply to me. I can easily see SAD being caused by lack of natural light, and I try to spend as much time as possible in sunlight (which is hard given a normal work schedule).
"most commonly", but there is some evidence that SAD is not triggered only by amount of light:
"This model also explains the otherwise confusing tendency of some SAD sufferers to get depressed in the summer. The problem isn’t amount of light, it’s circadian rhythm disruption – which summer can do just as well as winter can."
"With a band gap of 4eV, glass can't absorb any photons with less energy than UVB light; namely, it is transparent to UVA, visible light, infrared, etc.; but the higher-energy photons can be, and are highly likely to be, absorbed."
So it seems hard to create glass that doesn't block UVB.
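The quoted band-gap argument can be sanity-checked numerically with E = hc/λ. A minimal sketch (the 4 eV figure comes from the quote above; the helper names are my own):

```python
# Photon energy E = hc / wavelength. A ~4 eV band gap corresponds to a
# cutoff wavelength of about 310 nm, right in the UVB range (280-315 nm),
# which is why ordinary glass passes UVA/visible light but absorbs most UVB.

HC_EV_NM = 1239.84  # Planck's constant times c, in eV*nm

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a photon with the given wavelength, in electron-volts."""
    return HC_EV_NM / wavelength_nm

def blocked_by_glass(wavelength_nm: float, band_gap_ev: float = 4.0) -> bool:
    """True if the photon has enough energy to be absorbed across the band gap."""
    return photon_energy_ev(wavelength_nm) >= band_gap_ev

for label, nm in [("UVB (300 nm)", 300), ("UVA (350 nm)", 350), ("visible (550 nm)", 550)]:
    print(f"{label}: {photon_energy_ev(nm):.2f} eV, blocked: {blocked_by_glass(nm)}")
```

A 300 nm UVB photon carries about 4.1 eV and is absorbed; 350 nm UVA and visible light fall below the gap and pass through.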
Feynman knew this and claims to have been the only person to watch the Trinity test with the naked eye, rather than through welding goggles:
> In Surely You’re Joking, Mr. Feynman, American physicist Richard Feynman speculates that he may have been the only person who watched the Trinity Test relatively directly, using a windshield to exclude ultraviolet light. Everyone else, he claims, was looking through something akin to welding goggles.
Lasers didn't exist yet to force us to study retinal exposure to bright non-UV light and the flash from the bomb didn't last that long, nor were the first few bombs that bright, so he may have been fine... but obviously if you stare at the sun through three inches of glass you're still going to burn your retinas.
He says "I figured the only thing that could really hurt your eyes (bright light can never hurt your eyes) is ultraviolet light. I got behind a truck windshield, because the ultraviolet can’t go through glass, so that would be safe, and so I could see the damn thing."
What we know from later laser research is that unless you have a comprehensive visual field test, it's often hard to identify that part of your retina has been scorched. Your brain just filters it out as a blind-spot and you don't realize what you're failing to see.
There's every chance that Feynman totally burned a section of his retina and never realized it, and there's every chance that he was fine because the exposure at his distance wasn't that bad, but at the end of the day he was more reckless than insightful in this situation.
Maybe he should have closed one eye, but the first atomic bomb only explodes once. Your body won't last forever no matter how well you care for it (as Feynman well knew, having just watched his wife Arline die slowly of incurable tuberculosis); seeing the first atomic-bomb test with your own eyes seems eminently worth the risk of blindness, or even sacrificing an eye.
I really can't agree. I would not exchange my eyesight for the whole world, literally. Therefore the kudos of having looked at a novel explosion in no way compensates for any risk to my eyes.
Downvoted not because I’m afraid of my own mortality, but because pointing out the obvious fact that nobody can see after they die is neither insightful nor useful. Go tell that to someone who has become blind and let me know how much consolation that provides them.
Even if I knew I was going to die tomorrow, I would not exchange my remaining eyesight for kudos. I would prefer to spend my last day looking at my family, the trees, the running water, the flowers. It seems totally uncontroversial to me; I'm really surprised to find someone who disagrees.
Sorry, no, I just meant that the maximum human lifespan recorded so far is 12.2 decades, and the vast majority of people reading this will die in even less, 2–5 decades, regardless of what happens with climate change. Given that, it's silly to treat your body as if you could make it last forever.
Yep, he basically says that he knew the only thing that would damage his eyes would be UV-B, so he just went for it. Long time since I read the book but that's what I remember.
I don't get this type of thinking. Unless he thought it would be worse to wear the goggles, what is lost by wearing them (in that situation) just in case you were wrong? Why not reduce the chance of harm as much as you can?
I think this also has to be considered in the context of an ongoing cataclysmic war encompassing the world where whole cities were being destroyed and of course they were developing a weapon to destroy them faster. Today, WWII is that long ago thing that lasted for a few years and then it was over. Soon the veterans and the Holocaust survivors will all be dead. To anyone then, a lot of things probably didn't seem as important. A lot of people reacted to the atomic bomb once it was public as the impending end of the world, too. So the scientists who knew about it first probably had their attitudes affected.
The windshield didn't block a lot of other high-power radiation, and he died of cancer 35 years later, though the connection is not scientifically certain.
So in case anyone just panicked thinking “Wait, so do my car and office windows block cancer producing UV or not??”, this is what I found on cancercouncil.com.au (first google result):
- UVA penetrates deeply into the skin (the dermis) causing genetic damage to cells, photo-ageing (wrinkling, blotchiness etc) and immune-suppression.
- UVB penetrates into the epidermis (top layer of the skin) causing damage to the cells. UVB is responsible for sunburn – a significant risk factor for skin cancer, especially melanoma.
Which, contrary to what I knew, links melanoma to sunburn, not DNA damage.
Melanoma is definitely due to DNA damage; somatic mutation is the principal cause of all cancer. In this case the cause is either direct damage or damage by free-radical byproducts created by UV. UV causes a kind of lesion in DNA called a pyrimidine dimer, where two adjacent bases mutate at once. By far the most common mutation in melanoma (responsible for about 50% of cases) is the V600E substitution at codon 600 of the BRAF gene.
The UV damages the DNA. That step is usually required to produce melanoma cancer. The sunburn causes the deeper cells with damaged DNA to multiply in order to replace the damaged cells.
A single quiescent skin cell with precancerous DNA can be cleaned up by the immune system. A precancerous cell that has multiplied itself to cover a patch of sunburn, activating some of the genes for rapid growth, is much harder to clean up.
UVB causes melanoma, which is also a form of cancer. So typical glass would prevent sunburn and the melanomas that follow from it, but not the deep genetic and tissue damage that UVA causes. It is not a bad idea to apply SPF moisturizer before going out if you plan to spend any time with the sun shining on you.
> So in case anyone just panicked thinking “Wait, so do my car and office windows block cancer producing UV or not??”
I just figured automotive glass doesn't block UV (or at least all of it) since window tinting places always advertise UV blocking as a feature of their films. Cynically I know it could be just empty marketing, but it didn't seem like it.
Of course, ordinary light, while less energetic, penetrates still more deeply into skin, and it, too, causes DNA damage: in situ, DNA is damaged by light absorbed by less transparent molecules near it.
Sapphire is a chemically simple compound: aluminum oxide.
The problem is that its melting point is extremely high. To make artificial sapphire you have to melt the stuff at a very high temperature and let it drip and collect onto a ceramic base or something similar. And then you cut, grind, and polish it into shape, which is not easy either, because it's an extremely hard material.
The challenges of making a large flat plate of Al2O3 would be huge.
Nikon released a lens in 1984 that offered transmission and correction from UV to IR -- from ultraviolet, through visible light, to infrared (near IR, not heat).
This product was originally announced in 1984 as the Nikon 105mm f/4.5 UV-Micro-Nikkor, and from September 1985 it was marketed as the Nikon UV-Nikkor. The lens sold at the time for $2,200.00 USD, then about half the cost of a full-sized car.
A mildly amusing related aside: because of this, glass is essentially opaque across much of the UV spectrum. Looking at the world in UV [1] is quite interesting! Such a reminder that what we experience of the world is so largely a product of our physiological composition. What is perfectly transparent to us would be a great hiding place from the perspective of something that only saw in UV.
My grandfather, who was an ophthalmologist with a passion for inventing, has shown me schematics for this kind of glass. His idea was that you could put it in places that are sunny but not warm (like a mountain hotel) to allow for indoor sunbathing.
We always make fun of him for this idea as it's one of his strangest. I don't think he ever finished filing the patent.
> So it seems hard to create glass that doesn't block UVB.
Physics aside, why would you want to do that in the first place? UVB is the chief cause of skin reddening and sunburn and plays a key role in the development of skin cancer and a contributory role in tanning and photoaging. [1]
That little benefit of triggering Vitamin D synthesis is not worth the increased risk of skin cancer, IMO. And the author lays out the alternative there too: "Those concerned about low vitamin D levels can get more of the vitamin through foods."
There is an alternative view that regular sun exposure is beneficial, and that many of the health benefits linked to vitamin D are actually just using vitamin D as a proxy for sun exposure. If this is the case, it's plausible that allowing for more UVB exposure indoors would be a net benefit.
I don't doubt that sun exposure is a contributing factor to skin cancer, but it just does not make sense that skin cancer rates have shot up over the past 100 years, while at the same time people spend less and less time outside.
IMO, there's probably some other causal factor(s), and reducing sunlight exposure is not the solution.
Sunlight exposure is important for health, not just for Vitamin D (which others have pointed out, may just be a proxy for some other factor of sun exposure). It's important for regulating circadian rhythm, as well as preventing myopia in childhood.
If you take a dive on PubMed, you'll see that indeed, sunlight exposure appears to be inversely correlated with all-cause mortality.
Why? It is not known, but even when controlling for physical exercise, Vitamin D status, and other factors, the correlation still holds. Some authors suggest other chemicals produced by sunlight exposure, not only Vitamin D, might be involved.
> Physics aside, why would you want to do that in the first place?
I worked on a product that had a UV sensor. It needed to be protected.
Sourcing glass that didn't block UVB, that could be used in a mass-market product, at cost, integrated into a manufacturing line, was a bit of a challenge. The mechanical engineering team eventually got hold of some. For a while, there were weekly status updates of "got another manufacturing sample, spec sheet wasn't quite honest, it blocks some UVB."
The answer to all such questions is something related to the energy levels of electrons in that material. If there's a resonance somewhere, photons at that energy get absorbed.
You can coat it - sunglasses are, for example. The coating can be invisible to the naked eye too, I think.
This is one of those situations where I think government intervention is needed. I bet the long-term benefits of coating glass like this are very real - both for society and individually (especially in professions that involve a lot of driving). However, the short-term economic incentives work against it - there is probably a strong first-mover disadvantage. Also, what is the economic benefit to a landlord of having UV-proofed glass for their tenants?
But if government were to implement a policy of requiring glass in cars and buildings to be coated like that? That levels the playing field. I doubt it is going to happen any time soon though.
But if we were ever going to do that, I know for a fact that there are also coatings with reflective layers (invisible to us) that tell birds the glass is there, which would also save a lot of wildlife.
Ah, right. I somehow thought GP suggested near the end that it seems to be hard to fabricate glass that blocks UVA, and answered the question of how to deal with that.
Ah yes, the classic "I just assert that governments can only intervene with wrong solutions, without in any way engaging with the topic at hand and actually arguing why it is a wrong solution."
Was your supplement dosage enough, though? On the order of 10,000 IU per day?
The problem with vitamin D was that up until about a year ago, official recommendations in many countries were mistakenly low (500-1000 IU), which, as it turned out, was not enough for many people. It was even hard to find pills with high enough dosages, because when you went to your local pharmacy, which was going by the official recommendations, all they had were 200-500 IU pills; they were not expecting anyone to need more. So as an average citizen who does not do their research properly, if you just went into a pharmacy and took the first Vitamin D supplement you saw, chances were you were getting one that had basically no effect.
When the dosage is right, many people experience positive effects, and low Vitamin D levels really do go up. The sun is not required.
The problem with Vitamin D3 supplements is that you don't absorb most of it. It is well known that the bioavailability of the supplements is not the same and that even the molecule itself is not the same as the one your organism produces.
Also, the science is starting to question the supplementation. It’s too early to tell, but one hypothesis is that it is a biomarker at the end of a pathway and not at the beginning of it, i.e. as a consequence of a proper diet and sunlight, and not a cause of a healthy system.
So the vitamin D might be there due to a healthy system instead of the system being healthy due to vitamin D.
An analogy would be the analysis of car exhaust, finding certain levels of CO2, and trying to fix cars with "bad exhaust" by throwing more CO2 into the intake (except maybe this would even have a negative effect by hindering combustion).
> An analogy would be the analysis of car exhaust, finding certain levels of CO2, and trying to fix cars with "bad exhaust" by throwing more CO2 into the intake (except maybe this would even have a negative effect by hindering combustion).
This is a hilarious (and fitting) analogy for nutrition science.
There was a classic TV ad by an oil company that said that CO2 is great because it is plant food. It's still an argument used by folks who oppose CO2 controls.
> It is well known that the bioavailability of the supplements is not the same and that even the molecule itself is not the same as the one your organism produces.
Reputed and trustworthy citations from scientific sources required, please.
Every cholecalciferol molecule is the same. This is chemistry. Chemicals that are not the same cannot have the same chemical name. Every animal (without a genetic defect specific to the precursor chemicals) produces the same 7-dehydrocholesterol molecules, and every 7-dehydrocholesterol will, under UVB light, convert to cholecalciferol.
Industrial production of vitamin D3 irradiates the 7-dehydrocholesterol extracted from sheep's lanolin with UVB light.
There is some question as to whether ergocalciferol--a chemical mainly found in fungi that produce ergosterol and have been exposed to UVB light--is biologically equivalent to cholecalciferol in humans. It can alleviate vitamin D deficiency symptoms, but it is not known with certainty whether it can produce a sufficiency. As far as I am aware, the known cases of hypervitaminosis D have resulted from ergocalciferol supplementation, rather than from cholecalciferol.
If you are supplementing with vitamin D3, the chemical you are consuming is identical to that produced in your skin under UVB irradiation. There is no evidence whatsoever that it is destroyed in or poorly absorbed by the human digestive system. If you swallow 15000 IU of vitamin D3, that is the equivalent of standing shirtless in temperate midday sun for 15 minutes, after which time you will achieve no further benefit until some time has been spent absorbing the cholecalciferol and replenishing the 7-dehydrocholesterol in your skin.
It is poorly known that if you have to preface a statement with "it is well known", what follows is less likely to be "known" than "unattributably rumored".
"It is well known" is a weasel phrase and hence requires more substantiation when asked for. If one has to rely on Googling these matters, one can find many contradictory views. So I wanted to know what reputable sources the GP was relying on to use "It is well known" for that claim.
Oh by the way sunscreen is terrible, it doesn't have a proven link to reducing cancer and it destroys reefs so do yourself a favor and get more sun without all the slimy goop so you can feel better AND save the planet!
Do you have a source for the 10,000 IU per day guideline? If that's D3, that's 25 times the "recommended" amount, well above what is considered safe for long-term use. Now, I realize that our understanding of these things is in a state of (often violent) flux, but I would like to know who's recommending such a high amount. The problem with Vitamin D is that it is fat-soluble, not water-soluble, so your body doesn't expel the excess very quickly and it builds up. Too much can be harmful, particularly to your bones and kidneys[1].
> well above what is considered safe for long-term use.
The Mayo clinic article says:
> Taking 60,000 international units (IU) a day of vitamin D for several months has been shown to cause toxicity.
So 10,000, while well above the "recommended" dosage, is nowhere near the toxic level. Although the last time I looked around, the cited toxic level was 20,000; still, I've seen no guidance regarding Vitamin D suggesting that 10,000 IU/day can be toxic. (Especially if your BMI is over 25 and you're not getting much sunlight.)
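For reference, the IU figures being thrown around convert to mass at a fixed ratio (40 IU per microgram of cholecalciferol), which makes the doses easier to compare against labels and guidelines that use micrograms. A minimal sketch (the helper name is my own):

```python
# Vitamin D dosing: 40 IU = 1 microgram (mcg) of cholecalciferol.
IU_PER_MCG = 40.0

def iu_to_mcg(iu: float) -> float:
    """Convert an IU dose of vitamin D3 to micrograms."""
    return iu / IU_PER_MCG

# Doses mentioned in this thread:
for iu in (500, 4000, 10000, 60000):
    print(f"{iu:>6} IU = {iu_to_mcg(iu):>7.1f} mcg")
```

So 10,000 IU is 250 mcg and the 60,000 IU toxicity figure is 1,500 mcg per day.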
The Mayo Clinic article concerns excessive supplementation of vitamin D in isolation, specifically.
A hypothesis exists that increasing vitamin K intake in proportion to the vitamin D, and restricting calcium intake, would eliminate the main symptoms of excessive vitamin D supplementation. Nonhuman animal experiments support this, but it has not been tested in humans (likely due to the obvious ethical concerns).
Vitamin D, vitamin K, and calcium are all related.
I have seen guidance specifically for vitamin D2 (ergocalciferol), indicating that adverse symptoms may be observed at lower levels of supplementation than for vitamin D3 (cholecalciferol), but it was quite some time ago, and I don't know if the claim was confirmed or refuted scientifically.
I know you have faith that taking large doses of vitamin D isn't dangerous, but there is very little evidence supporting its value. We all have to make serious decisions every day without scientific support, so I don't begrudge you the right to do what you believe is best for you. But don't you think it's a little irresponsible to try to convince people that this belief of yours is actually effective? AFAICT, there's as good a chance that your advice will take ten years off a life as there is of it improving one. And you want to push this on others?
That's a good read. He says you can go high (10,000 IU/day) for 6-10 months, under a doctor's supervision, then cut back down to 2,000-4,000/day. This also makes sense from the perspective that it's fat-soluble, so you should be able to "fill up the tank" as he puts it.
Thankfully it was posted to HN so I could find it. Google search has truly become useless for this stuff. It's early in the morning, but I think I'm reading the abstract right. [1]
To add some anecdotal info, those numbers are much closer to my own personal experience. I've been experimenting with various dosages, with regular blood tests by a doctor, for a few years now. Currently I've settled at 4,000 IU/day to stay at 75 nmol/L, which is the lower end of what is considered normal.
Personally, I've been taking 5000 IUs/day and recently started stepping that closer to 10,000 IUs. I use the liquid drops which are in oil, and I always try to take them with some sort of meal. The liquid is a lot cheaper and seems like the ideal way to absorb it to me.
This is in cloudy wintertime Seattle. In sunny times it may be completely unnecessary.
Just another anecdote, a friend of mine with Lyme Disease has serious problems absorbing vitamin D (and other nutrients) via food, his doctor has him on 10K per day.
That is interesting. I had read that 500 was ineffective but did not know that 10,000 was effective. If you've done the research, what is the most reliable source you found?
Taking vitamin D at such a high dose can give you health problems such as kidney stones. When I took 5,000 IU for a period of time I started getting painful kidneys just from the urinary effects.
Now I try to get my vitamin D through more normal sources by eating lots of UV-exposed mushrooms, salmon, and sardines. The plus side is that these foods are all quite healthy even without the vitamin D.
OP would still likely be fixing their problem though, right? That article says it is the sun exposure itself that has a positive effect, and vitamin D is the byproduct.
To be clear, that's a really useful article, just want to clarify that your intent is indeed to explain why vitamin D wouldn't work but a UVB lamp would.
Another possibility: higher vitamin D levels are associated with health because people who are outside more might be getting more exercise, and it's really the exercise that is doing it.
Vitamin D is something we can measure and often serves as an indicator, but large scale tests have shown limited effectiveness of supplements at resolving the issues a Vitamin D deficiency causes. e.g. We increase blood Vitamin D, but there are unknown correlative mechanisms that we aren't substituting.
The evidence on this seems to be changing frequently, not unlike the evidence on whether taking vitamin supplements is a good idea (https://www.health.harvard.edu/staying-healthy/should-you-ge...). Generally speaking, the latest science on nutrition seems to be more subject to later revision, or even reversal, than the latest science on, say, physics or geology.
So, while we wait (perhaps decades) for nutrition to become a more reliable science, I like to consider it from an evolutionary angle. That which is very dissimilar to what my ancestors experienced, like plentiful sugar or a sedentary lifestyle, is more likely to be bad for my health.
Exposure to the sun for hours a day, including the UV part, was part of our ancestors' lifestyle from the beginnings of our species, until a few decades ago. Now it is possible that this is the exception, and we should keep ourselves shielded from UV at all times even though it was what we evolved for. But...
- we don't know much about our skin microbiome, but we know we have one and it matters, and it seems unlikely that removing all UV would not impact it
- we know there's an impact on vitamin D levels, but the details of that are still unknown
- we know that eczema, psoriasis, and other skin conditions have been rising in recent decades, since the "always wear sunscreen" advice became common
Now, maybe that's all a coincidence, and UV is bad for us, and lack of UV is good for us, regardless of the fact that it's the environment we evolved for. But I'm skeptical.
There's another important factor to consider, and that's how we expose ourselves to UV. Our ancestors used to be outside most of the day, allowing skin pigmentation to adapt to the regional and seasonal UV levels. Until agriculture came along fairly recently we also lived mostly in the shade of the forest and not in direct sunlight.
Compare that to today, where UV exposure mostly means exposing the skin to massive bursts of UV for short times, at best in an effort to build skin pigmentation that isn't sufficient. That's completely different and much more damaging, leading to the general feeling that any UV is bad.
I think our ancestors lived in the full sun of the African savannah a lot more than the forest, and even in northern Europe, hunter gatherers were killing meat-on-the-hoof (mammoth, wild horse, etc.) so they were running about in the open a lot.
However, the change in exposure is surely true. I'm just not at all convinced that complete shielding from UV is a better choice than occasional UV.
Until recently, children would run around outside a lot during the summer, even in urban environments, and especially in suburban or rural ones.
Be careful of the appeal-to-nature fallacy. Life expectancy was much lower a long time ago, and causes of death from poor health were not recorded very accurately.
Choosing the natural over the artificial is a reasonable default where the science is uncertain. But just remember that science has eliminated an awful lot of problems.
Also remember that natural experiences aren't always a la carte. Picking natural foods from disparate regions might be very unnatural, as is getting natural UV light after taking a shower.
Certainly, many natural things are not good. However, the idea that a complete absence of natural light (which includes UV) is good, is suspect. It reminds one of the idea, in the early 20th century, that getting rid of all bacteria was good. It turns out that there are way more kinds of bacteria than we know, and an excess of antibiotics and disinfectants is almost certainly a big part of why we have ever-escalating rates of severe allergies and auto-immune disorders.
Not that we would wish to have a complete absence of antibiotics. But, the attitude of "you have a virus, so take this antibiotic just in case there's also a bacteria", sounds a lot like "block all UV light from hitting your skin, at all times". I am skeptical.
I mostly agree. I'm just saying that you might want to stop using soap as well ;-)
Using soap has a big effect on how the UV light hits your skin, the damage it might cause, and the benefits that it might have. Of course this is all speculation on my part, but I am just warning of the idea of picking one natural thing at a time -- natural things go together.
Maybe instead of sunscreen, we should be using mudpacks. :) I think it might hurt my chances of getting employed, unless I worked only remotely. Also, my wife would have words.
However, anecdotally, the "mud beggars" at modern Renaissance Faires are said to have healthy skin. It could be a confusion of cause and effect, though.
Do consider that your ancestors developed at a particular latitude in a particular climate. E.g., Anglo-Saxons developed in northern latitudes and have pale skin. The fact that they spent many hours a day in the sun unprotected does not mean that an Anglo-Saxon would be advised to do the same in Phoenix.
Absolutely. I do think that the typical amount of sun which a descendant of northern Europeans, now living in Phoenix, gets is probably still not as much as their ancestors got, simply because the amount of time spent outside is so much less. But your point is well taken.
I think there's a business opportunity for someone to make light bulbs which emit a little UVB. Not enough to cause sunburn or raise skin cancer risk, just enough to keep our vitamin D levels up. That way you could just stick it in a light fixture and go about your normal life. Is such a light practical to build?
There are plenty of tanning (UVB) lights at all different wattages available for purchase.
I think the problem is that the visible light output tends to be harsh, and the bulbs are highly inefficient, expensive, and short-lived.
So there would be a lot of problems to solve, not the least of which is the cultural issue you raise of scaremongering around UVB exposure in the first place.
There is one FDA approved vitamin D lamp made by Sperti.com. Costs about $400, has a 5 minute timer, instructions say to use it once every other day, the UVB spectrum of the lamps is tailored to Vitamin D production. I have one, it works better than pills.
Tanning beds are banned in some countries with high skin cancer rates for good reason. It is a lottery and you are accumulating risk with exposure. I strongly suspect (without any real evidence) that there are a number of beneficial health effects to moderate UV exposure that can improve quality of life for some people but the experience with fair skinned populations in high UV parts of the world has been overwhelmingly negative. The public health advice is basically written in blood on this topic and so I would view people and sites promoting alternative views with an appropriate amount of skepticism for now.
I agree that food doesn't have enough vitamin D, but supplements really made the difference for me. Just get the right dosage (which is probably higher than you think).
Were you measuring in observed serum vitamin D levels, or just in subjective wellbeing? Light therapy fixes Seasonal Affective Disorder but doesn’t necessarily do so by increasing vitamin D production.
Could I ask what lamp you chose and how long you spend in front of it? I take a fairly high dose of D3 and I'd be curious to try artificial light. Thanks!
I never use it for more than 1 minute and I alternate between facing away and towards it. For the first month I used it every day, now more like every other day or every two days.
Tangentially related: I got an androv full-spectrum lamp several years ago for some color work. It was one of the few true "full spectrum" bulbs you can get for cheap that are decently balanced without resorting to a combination of bulbs. And yes, we tested it with a spectrometer.
Most other full-spectrum lamps I tried at the time on amazon didn't get even close.
If you never tried, working with these lamps is weird: turning it on in the evening or night by mistake _really_ wakes you up.
>... now more like every other day or every two days.
Can I ask the difference? I'm assuming a typo, but haven't experienced sun shortage for at least six years, so I'm not sure on boom / bust cycle potential with these things.
A few applications of 15 minutes each with a high-pressure mercury lamp with a good reflector and you look like you went to the Caribbean for holidays. Maybe add a filter for UV-C, but there is no ionizing radiation. Just show any lamp in question to your favorite dermatologist. If he seems unhappy, you have chosen the right lamp.
Jokes aside, skin is very photo-active and I do think lamps can help greatly against these kinds of deficiencies.
Most lamps emit a line spectrum though, so it is not like real sunlight. I wonder if that is the reason people going to solaria often seem to have weird skin tones. Ask a dermatologist and don't forget eye protection.
Apart from that it has the same side effects as sun exposure. Too much leads to skin aging, sunburn, cancer, etc.
Those side effects can be mitigated or at least reduced by adding extra infrared or red light. Eg sunburns are prevented or greatly reduced by treating with near infrared either before or after UV exposure.
I dabbled in light therapy and found the first lamp I tried [1] very effective. However I have also started receiving comments from my colleagues about my nice new facial tan, in the dead of winter. I freaked out because the lamp was in my field of view 30 minutes per day every day without any eye protection, and sent the lamp back to where I bought it. Subsequently I have tried three different LED-based lamps and they seem to have no effect. I have carefully measured the light output and placed myself at proper distance to get the requisite 10,000 lux in all cases. No dice.
Here's the kicker - that lamp [1] appears to be the one used in some of the light therapy research (that's why I got it). It is entirely possible that much of light-therapy research was contaminated by badly designed lamps. There is no FDA certification for any of this as far as I know.
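As a rough sanity check on lamp placement: illuminance from an approximately point-like source falls off with the inverse square of distance, so a small error in seating distance changes the dose a lot. Panel-style therapy lamps deviate from this at close range, so treat it as a sketch; the numbers below are hypothetical, not from the comment.

```python
def lux_at_distance(lux_ref: float, d_ref_m: float, d_m: float) -> float:
    """Scale a reference illuminance reading by the inverse-square law.

    Treats the lamp as a point source, which is only a rough
    approximation for large panel lamps at close range.
    """
    return lux_ref * (d_ref_m / d_m) ** 2

# Hypothetical: a lamp measured at 10,000 lux from 0.3 m
# drops to a quarter of that at twice the distance.
print(lux_at_distance(10_000, 0.3, 0.6))  # 2500.0
```

This is why "placed myself at proper distance" matters so much: sitting 0.6 m away instead of 0.3 m cuts a 10,000-lux exposure to 2,500 lux under the point-source assumption.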
It only blocks most of the UV. The lamp itself is blasting a very intense amount of light, so that the circadian-sensitive mechanisms in your eyes are fooled into thinking you're outside on a bright summer day. The vitamin D is secondary to the main function of the lamp, which is to be incredibly bright in comparison to most indoor lights. The light-therapy lamp must be aimed at the eyes (even for blind people!). It just doesn't work as well if you can't see it directly. You would have to pump even more light into the room to get the same amount reflected off of other surfaces.
Also, the lamp could be putting out UVA and not much UVB, so it could tan without producing much vitamin D.
For just the vitamin D, I'd get a "5.0" UVB lamp in a bare-bulb fixture, designed and sold for indoor avian and reptile pets. Aim at arms and hands, and away from the eyes. They're about $15 per bulb, and should be replaced every 6 months, as the UVB output degrades over time. Afterward, they work perfectly fine as compact fluorescents, behind glass or under lampshades, for several more months, albeit shaped inconveniently for some fixtures.
A "10.0" bulb is for desert-species pets. The UV output is too high for the overall lamp brightness, and may damage your eyesight, as human pupils don't constrict enough. I suppose you could wear sunglasses, but that seems counterproductive to just buying a lower-UV lamp.
Magnesium deficiency and a general mineral imbalance that resulted in severe insomnia. I also developed a bunch of food intolerances that were getting ever worse.
I was supplementing with a lot of magnesium. Sleep returned very quickly, my need for supplementing magnesium also dropped in days.
After reading the "Big Vitamin D Mistake" article on HN in 2017 I also started experimenting with larger doses of Vitamin D and can attest to life-changing effects.
It has been like a permanent +5 to constitution; I am now almost through my second winter without getting sick, something inconceivable for me as a kid growing up with asthma and a whole range of puffers, and always prone to debilitating sinus infections that took hold in January and never really let up until sometime around April.
I initially started with 1000 IU of Vitamin D3, Costco brand, and eventually settled on 5000 as I noticed improvements in energy level as I took more.
I am at latitude 43, so I only start taking it in mid-November and start weaning myself off it in late March (you never really know if/when spring will happen up here in Toronto :)
Couldn't it be that this depends on the person? Maybe others can use the vitamin D present in food? I.e. part of your deficit might be because your body somehow cannot process the D in food? Just guessing here..
Narrowband UVB is a very effective treatment for some skin conditions. The mechanism is totally different from vitamin D, and oral supplementation basically has no effect. UV exposure is a significant health risk in some parts of the world, and the public information campaigns to reduce exposure seem well justified. But I wonder if there isn't a case for a narrowband UVB sunscreen that only lets those dangerous but highly effective DNA-mutating rays through, along with some sort of cheap and reliable dosimetry.
I live in Southern California and it's common for people to have these effects when moving here from darker locales and spending time outside. My severe insomnia almost vanished after coming here.
It's not about us: UV light destroys almost all types of plastic and changes the color of wood and fabrics - so without glass filtering UV, our furniture, floors, and pretty much everything else inside our houses and cars that is exposed to sunlight wouldn't last nearly as long.
It's been 500 years since Paracelsus supposedly said something along the lines of: "All things are poison, and nothing is without poison, the dosage alone makes it so a thing is not a poison."
Evolution will happily incorporate an abundant resource into one biological process even if it is damaging to another. Heck, even if it is damaging to the same process it is being incorporated into.
The heat we produce in oil motors is what makes them work, but it's also what destroys many components. I don't pretend life has been designed or not, but it doesn't look stupid to me: in a relative world, everything is a compromise.
It's far worse than the trade-offs required by physics.
Evolution is short-sighted in a way that is hard to overstate. It exclusively cares about the current environment, it can only consider alternatives when they have actually been implemented, it can't look-ahead to see if a chain of changes is really desirable, and it can't even look-behind to determine if a previous design is better-suited to the current environment.
A better analogy to the way evolution operates using an engine would be: imagine that you first develop an engine that runs on a pure fuel, but then for a long span of time you can only get an inferior, highly contaminated fuel that produces a corrosive acid while it burns that slowly destroys the engine from the inside. You make a few tweaks to mitigate (but not stop) the damage from the acidic residue, but then you discover that by adding an additional afterburner with a special additive, you can get an extra 8% output from the engine. But then later when the pure fuel is available again, you discover that running the engine with the afterburner on a pure fuel will make it explode in short order. It's too late to redesign the engine without the afterburner -- most of the engines currently in existence suffer from the problem -- but it's pretty easy to re-contaminate the pure fuel so that it continues to produce the acidic residue that is both destroying the engines and expected by the afterburners. Meanwhile, most people still don't want to buy a pure-fuel engine without the afterburners since they have a lower output, and you really don't need most engines to last forever anyway.
That's the sort of bullshit you deal with in an evolved system.
> Evolution is short-sighted in a way that is hard to overstate. It exclusively cares about the current environment
This isn't quite accurate; while some evolutionary pressures are indeed exclusively short-term, not all are.
For instance, ability of eukaryotes to have multiple alleles of individual genes combined with sexual reproduction allow populations to maintain a large diversity of alleles and genes, which have the effect of allowing large-scale "split testing" and also "alternatives" for different current environments that may appear over time.
It seems like you'd like for evolution to have some model, so it can test old designs in the current environment, and simulate forward the current design into future environments, without needing to burn an organism and time on the test. But how could that be possible?
By evolving intelligence that creates computers and models and does that exact simulation, and then modifies the necessary biology to apply the best scenario. Seems to me we’re well on our way there.
Well, evolution doesn't care if you die at 50 from sun-induced cancer or at 80 from heart failure; as long as you've pumped out offspring capable of reproduction, you succeeded.
Plus, nowadays, with tech and medical advances, evolution doesn't play the same role as 10k years ago. You can be obese, diabetic, lactose intolerant, lose a kidney and a lung, and still have kids, and your life expectancy will be miles ahead of our ancestors' hunting antelopes with spears and pointy stones.
This is a commonly-held misunderstanding! Natural selection doesn't stop acting once you've reproduced.
Characteristics ("traits") which are potentially not helpful for direct reproductive success but have an impact on the success of the group of people you're a part of are conserved through evolution. However, the exact formulation of this idea (known as kin selection or group selection) is still quite controversial [1] among evolutionary biologists.
Sure, it depends at which scale you observe it. It's probably impossible to study individual traits vs. "group" traits, though. As I see it, it's one of those problems with so many confounding variables that we'll never agree (depending on who you listen to: fat is good/bad, carbs are good/bad, sun is good/bad, ad infinitum).
My point was, something can be good at some point (vitamin D production) and turn bad in the long term (increased cancer rate), but being ultimately bad doesn't mean evolution failed.
Maybe the intelligence behind the design was to make something that works, instead of something perfect? Isn't that better, after all?
(Also is there even any indication anywhere that a perfect thing is possible at all, whether be by design or a result of random chaotic process?)
(again, not saying whether the design might be intelligent or not, just that the argument is not really logical concerning that).
> Apart from humans, most of the other menstruating animals are primates, the group that includes monkeys and apes as well as humans. Most monkeys living in Africa and Asia, such as rhesus macaques, menstruate.
> menstruation also evolved independently in two other groups: some bats and elephant shrews.
Not that any of this disproves the "evolution hackery" suggestion, mind.
Moderation may not suffice. I'd say Dihydrogen Monoxide is very dangerous given it's a colorless and odorless chemical compound, so it's not always easy for the layman to identify. Yet we can spot it in great quantity in most food, despite the fact that its pH is higher than that of most acids. You can't even wash it off at the sink; the more you do, the more you get in contact with it.
I mean, at least UVB you can protect against with simple glass. But something you find traces of in most people's blood is no joke.
We should suppress it completely. Or at least ban it from schools.
What likely happened is that once humans left the trees and started living on the ground, they had plenty of sunlight exposure; so, purely by chance, we evolved to use sunlight as a trigger for some important metabolic activities. It just so happens that too much sunlight exposure causes cancer. But since cancer happens later in life, it wasn't selected against, and we ended up with this weird biochemistry: UVB is necessary for health but it also causes cancer.
Don't downvote OP, it's a very legitimate question.
The media tends to represent everything as black and white, so it's easy to get confused, especially since people will tell you one thing and then the contrary, depending on the fashion of the day.
It's always nice to have those moments when you can highlight that reality has nuances. Let's encourage people to ask questions; we don't have enough of them.
And yes, Vitamin D is required, not UVB. You can acquire vitamin D by eating organisms full of it. But the most efficient way is still to get exposed to sunlight. That's partly why some people get depressed in the winter.
The sun is not "bad for the skin" any more than a cold water is bad for your health. But stay in it too long and you may die.
> And yes, Vitamin D is required, not UVB. You can acquire vitamin D by eating ...
There is some debate about this now, while sunlight certainly produces vitamin D, that's certainly not all that it does. This article showed up here recently:
Of course, but sunlight is not just UVB. Plus, the spectrum of one component is hardly what defines light interaction. The combination of spectra + quantity + intensity + the human receiving it is a more interesting model.
E.G: I just bought a light to help with the winter blues. It does not provide any UVB, so the method of action is something else.
Worth saying that for fellow British users, we won't be able to get any vitamin D from the sun until mid March. We live at too high a latitude - the sun is too low on the horizon.
On the other hand, UV tends to lower your folate levels. Populations' skin colors tend to track the latitudes they live in, but also the ratio of vitamin D to folate in their diets. That's part of the reason the pre-agricultural population of Europe, with their fish-heavy diet, had skin colors so much darker than modern Europeans'.
I've noticed plants do poorly behind the windows at my office (thick, newer glass) despite being south facing (northern hemisphere). At home, in a windowsill behind glass from the 60s, they thrive.
Anyone had a similar experience? Think it could be because of the office glass?
I think it's more likely that they don't like the warm dry air produced by the radiators that are typically installed below the windows. Chlorophyll doesn't absorb much UV.
My newborn daughter's pediatrician recently told my wife and me how important it is to sit in front of the window with our daughter, specifically for vitamin D generation via sunlight exposure (as opposed to taking her outside right now, because it's very cold/winter). I guess I (and perhaps all parents of winter babies?) should ask about a vitamin D supplement.
or just go outside with your baby. humans are pretty resilient, especially the rubbery little ones and they are really good at telling you when they are uncomfortable.
Great to know! I just started supplementing vitamin D this year in New England, and it seems like it's really helpful.
I think a big part of my point was about the resiliency of children. It's worrying that new parents are scared to experience the world with their children.
New parents are still getting up to speed on what is a real danger and what isn't. There are plenty of legitimate dangers that require consideration, as well as scary things that are in fact perfectly fine.
as a new-ish parent, i'd love to get my kids outside to soak up sunlight but it's nearly impossible to stay outside for any meaningful amount of time in the northeast winter.
I've seen parents schlep <1 year old babies up some of the White Mountains in mild winter conditions (snow on the ground, below 32F, etc). I think you are doing something wrong.
Ignoring the glass - in the winter, much of the US is at too high a latitude to get any UVB, as it is scattered by the atmosphere when the sun's angle is too shallow. The sun needs to be at least 50 degrees above the horizon for UVB to penetrate the atmosphere. In an Iowa winter, even at its highest point the sun is only ever 26 degrees above the horizon.
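The noon-sun geometry here can be sketched with the standard solar-noon approximation: elevation is 90° minus the angular distance between the observer's latitude and the sun's declination. The latitude and declination figures below are my own illustrative assumptions (roughly Des Moines at the solstices), not from the comment.

```python
def max_solar_elevation(latitude_deg: float, declination_deg: float) -> float:
    """Approximate solar elevation at solar noon, in degrees.

    90 minus the angular distance between the observer's latitude
    and the sun's declination (ignores refraction and other small terms).
    """
    return 90.0 - abs(latitude_deg - declination_deg)

# Winter solstice declination is about -23.44 degrees;
# Des Moines, Iowa sits near 41.6 degrees N.
winter = max_solar_elevation(41.6, -23.44)
summer = max_solar_elevation(41.6, 23.44)
print(round(winter))  # 25 -- well under the ~50-degree UVB threshold
print(round(summer))  # 72 -- comfortably above it
```

Under this approximation, the Iowa winter sun tops out around 25 degrees, consistent with the comment's figure, while the summer sun clears the claimed 50-degree threshold with room to spare.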
You should definitely check out a Vitamin D supplement for your daughter. Where I am from we have reduced sunlight during winter months and it is something all the pediatricians have recommended and asked if we are doing because of the lack of sunlight.
The one thing I learned in college was that schools outside of my state are pretty bad with "common sense".
"Common sense" is highly localized. My schools had health studies for sure, but no home economics. I had classmates in college who had home economics, but not very robust health classes.
Any and all combinations of learning exist in the USA alone. Each State is in charge of its own education department. Travel just 2 or 3 states in any direction, and you'll have a grossly different set of skills being taught at the high-school level.
Yeah, of course there are many things that are widely believed but false. But we were told this in a 30-minute course a doctor was giving, part of a campaign the schools ran to educate us about the dangers of the sun. A few other things were mentioned, like that there is no total-protection sun cream, and that even creams marked as water resistant do not actually resist water. That's when we learned that 1mm of "modern glass" cuts nearly all UV light, while it takes nearly 50 meters of water to block it.
Of course not all that might be true, but as far as I know, it is pretty accurate.
As I mentioned in another comment, I was aware of this fact. I would have been able to tell you as much if someone had asked, but vitamin D would never have crossed my mind as being an issue.
Thought the same thing, I thought this was common knowledge. But I live in the upper midwest where low D levels are common in winter, so perhaps it's just discussed here more.
Also note: research on vitamin D has suggested that it’s not vitamin D that provides the benefits often touted, but the fact that you are outside getting exercise.
I get my vitamin D levels check every quarter because of my kidney failure. Based on my past experiments, I'd say if you are in the bay area, walking in the noon sun for 15 minutes every workday would be enough to keep healthy vitamin D levels. In terms of supplements, about 1500 IU/day is enough during winter season.
My personal experience with nutrition makes me consider the childhood years the deciding factor in living healthy without much time or resources devoted to personal health. There are outliers - genetics and parenting will limit a child or not - but I can only assume it's a struggle to start unhealthy in your early-to-late 20s. All my childhood friends who were into skateboarding, biking, or another intense activity from sunrise to sunset cannot put on weight, typically have healthy skin (unless acne-prone), and are very coordinated with balancing.
The reason this article compelled me to write is that the next generation of parenting is restricting kids from the freedom of outside activities unless a parent is there with them. A lot will be lacking besides vitamin D, and it's terrible to think about. For most of my life I've believed that deterrents should exist against parents who let their kids become unhealthy. It seems we're heading in the opposite direction.
Might you be confusing correlation and causality? Whatever caused your childhood friends to be into intense physical sports, might also be the cause of their adulthood health - perhaps a natural inclination to physical activity, or maybe an unusually fast metabolism that gives them lots of energy to burn.
Not to suggest that you're wrong - childhood exercise probably is very important, and is under threat.
So I'd maintain some skepticism about the product's efficacy as it relates to health. (I tried finding other clearances related to the product and failed, but they might still be out there...)
"After adjusting for high levels of education, a well-established risk factor for myopia, they found that participants with the highest UVB exposure, especially in the teenage and young adult years, had about a 30% lower risk for myopia than those with the lowest exposure. The study also found no association with myopia and vitamin D levels or genetic variants in Vitamin D."
Be careful not to have vitamin D deficiency! I think that might have been one of things which contributed to my multiple sclerosis. I was sitting in front of pc most of the spring/summer, rarely spending time outside...
The article mentions commercial and automobile glass. What about forty year old double pane glass in an old house? On very cold winter days in Vermont I enjoy a kitchen bathed in sunlight. No benefit?
Common sense is a function of an individual's level of knowledge. For the vast majority of people, "common sense" would dictate answering this question "yes", even though in reality the answer is "no".
I knew about this - and I think that if you have had a deficiency of Vitamin D your doctor might have told you about it.
But I'd wager to say the majority of the population that didn't consult a doctor due to a deficiency won't know this. (Unless, perhaps, when they studied something in the medical field).
I think it's one of those things that should be common sense but we don't generally think about it hard enough. By that I mean everyone probably could figure it out if they just gave it about 30 seconds of thought. I clearly remember the day I learned this back in high school. My physics teacher was discussing the refraction of light in materials like water and glass and the conversation moved into how materials can break up the components of light (like a prism). Another student was trying to make sense of this concept of light breaking down into frequencies and its component parts but still be light after being broken down. Forcing us to think about it ourselves, the teacher asked the class two questions:
1) Have you ever gotten a tan anywhere on your body on a long car drive in the summer? Everyone thought for a moment and answered that they hadn't even gotten so much as a farmer's tan on their arm to which he asked...
2) Why?
This forced us to think about it and realize that the window was doing more than just refraction and was stripping out or absorbing part of the light and even though it appeared as the same light to us, whatever was being removed must have been in the spectrum of light not visible to humans.
So we all intuitively know this, but never stop to actually think about it.
Doesn't seem like much intuition was had in your example, to be honest.
The teacher primed you with "Why have you never gotten burned on a long summer drive?" and then you bought that as fact and backsplained it with some basic prism experiments, the backsplanation not really mattering once you buy the fact from the teacher.
In other words, I'm not sure what 30 second thought process you think the average person can walk through to replace a teacher telling you the fact outright.
Once you know a fact, people can "intuit" any explanation. Just talk to children and listen to their bizarre theories -- but I would not call that intuition because they cannot arrive at the original fact that way.
Besides, my intuition is the opposite. My friend was a truck driver and said he'd get sunburned hands. He always kept sunscreen in the cabin.
My point was that once you learn the basic physics underlying this, you should be able to come to the proper conclusion about the vitamin D question easily enough. Every person going through the US school system is required to take a basic physics course, as far as I know. So every child who receives a HS diploma in the US should know this basic concept and, if they actually thought about it, should be able to figure it out.
I think you're confusing the underlying theory with us encountering the question on the day we learned it and then dismissing the fact that we were primed with the current conversation to know the answer. What I'm saying is that once we had that knowledge about the underlying physics, the same question(s) could have been posed weeks or months later (with no priming) and we should have been able to figure it out.
I, for one, can't really think of any daily experiences that would count as evidence for or against glass being transparent to UV. Maybe if long drives in sunny weather are a typical experience for someone, it'll be common sense to them, but I'm rarely in a car for longer than a couple of hours at a time, and I develop tans gradually enough that I can't notice them over a span of days, let alone hours. My answer to your teacher's question 1 would be "no idea", since I wouldn't have paid attention to it in the first place.
Of course, given a premise "skin doesn't tan when sitting in a car", I think it's a good explanation for the subject.
So in a physics class, while learning about refraction, you were primed with a fact that you hadn’t personally noticed before and from that experience you deduce that this should be common sense?
It's not that we hadn't noticed it before. It's that we hadn't thought about it before. We all knew from experience that none of us had gotten any sort of tan through glass when asked; we just never thought to ask why, and before that day, we didn't know enough to answer the question. The fact that we were "primed" is just happenstance, because the question came up during the learning process. Once we knew the physics (which all HS students in the US learn), the same question(s) could have been asked on another day, when we weren't "primed" by the discussion, and we would have been able to deduce the answer. For example, if that anecdote never came up, the teacher could have put it on the final exam and we should have been able to figure it out. Again, my point is that anyone who went through the US school system theoretically has the capability of answering the question if they actually think about it. But most don't. That was my only point, really. It should be common sense, but isn't, because no one actually gives it any consideration.
I've occasionally posed this question to people during casual conversation, and I've always received the same answer - 'of course!' (... you make vitamin D behind glass).
It's still not clear to me if there are some types of glass that do allow vitamin D synthesis or not.
I was told that TV would turn my eyes square as a child, looking forward to having it proved by science so I can smugly announce that I've known all along.
Yes, I asked my science teacher this exact question when I was 15 (as I burn easily) and was happily told I couldn't get sunburnt behind glass. So far, experiments have proven this true.
A similar fact is that photochromic lenses will not work in cars, i.e. when you're looking at the world through the windshield. The photochemistry in them is triggered by UV rays. It's another thing that should be common knowledge, but isn't.
Auto glass must be somehow different, because in Florida, you can surely get "some color" on your skin on a long drive and the sun beating in through the window.
1. Seasonal Affective Disorder is treated using just the visible light spectrum entering the retinas of the eyes [1], and there are a ton of studies showing its effectiveness (in fact, most SAD lamps explicitly filter out ultraviolet light).
2. Some infrared passes through some glass and seems to be good for a thousand different things. [2] [3]
3. UVA passes through glass and has beneficial effects on the cardiovascular system. [4]
[1] https://www.sciencedirect.com/science/article/pii/0273230092...
[2] https://valtsus.blogspot.com/2017/05/the-therapeutic-effects...
[3] https://www.quora.com/Can-infrared-light-pass-through-glass
[4] https://www.sciencedirect.com/science/article/pii/S0022202X1...