Eat to Beat Depression and Anxiety – by Dr. Drew Ramsey
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Most of us could use a little mood boost sometimes, and some of us could definitely stand to have our baseline neurochemistry elevated a bit. We’ve probably Googled “foods to increase dopamine”, and similar phrases. So, why is this a book, and not just an article saying to eat cashews and dark chocolate?
Dr. Drew Ramsey takes a holistic approach to health. By this we mean that to have good health, the whole body and mind must be kept healthy. Let a part slip, and the others will soon follow. Improve a part, and the others will soon follow, too.
Of course, there is only so much that diet can do. Just as no diet will replace a Type 1 Diabetic’s pancreas with a working one, no diet will treat the causes of some kinds of depression and anxiety.
For this reason, Dr. Ramsey, himself a psychiatrist (and a farmer!) recommends a combination of talking therapy and diet, with medications as a “third leg” to be included when necessary. The goal, for him, is to reduce dependence on medications, while still recognizing when they can be useful or even necessary.
As for the practical, actionable advice in the book, he does (unsurprisingly) recommend a Mediterranean diet. Heavy on the greens and beans, plenty of colorful fruit and veg, small amounts of fish and seafood, even smaller amounts of grass-fed beef and fermented dairy. He also discusses a bunch of “superfoods” he particularly recommends.
Nor does he just hand-wave the process; he talks about the science of how and why each of these things helps.
And in practical terms, he even devotes some time to helping the reader get their kitchen set up, if it’s not already ready-to-go in that department. He also caters to any “can’t cook / won’t cook” readers, with ways to work around that too.
Bottom line: if you’d like to get rewiring your brain (leveraging neuroplasticity is a key component of the book), this will get you on track. A particular strength is how the author “thinks of everything” in terms of common problems that people (especially: depressed and anxious people!) might have in implementing his advice.
Don’t Forget…
Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!
Learn to Age Gracefully
Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:
-
Simply The Pits: These Underarm Myths!
Are We Taking A Risk To Smell Fresh As A Daisy?
Yesterday, we asked you for your health-related view of underarm deodorants.
So, what does the science say?
They can cause (or increase risk of) cancer: True or False?
False, so far as we know. Obviously it’s very hard to prove a negative, but there is no credible evidence that deodorants cause cancer.
The belief that they do comes from old in vitro studies applying the deodorant directly to the cells in question, like this one with canine kidney tissues in petri dishes:
Antiperspirant Induced DNA Damage in Canine Cells by Comet Assay
Which means that if you’re not a dog and/or if you don’t spray it directly onto your internal organs, this study’s data doesn’t apply to you.
In contrast, more modern systematic safety reviews have found…
❝Neither is there clear evidence to show use of aluminum-containing underarm antiperspirants or cosmetics increases the risk of Alzheimer’s Disease or breast cancer.
Metallic aluminum, its oxides, and common aluminum salts have not been shown to be either genotoxic or carcinogenic.❞
(however, one safety risk it did find is that we should avoid eating it excessively while pregnant or breastfeeding)
Alternatives like deodorant rocks have fewer chemicals and thus are safer: True or False?
True and False, respectively. That is, they do have fewer chemicals, but they cannot, in scientific terms, be qualitatively (let alone quantitatively) described as safer than a product that was already found to be safe.
Deodorant rocks are usually alum crystals, by the way; that is to say, aluminum salts of various kinds. So if it was aluminum you were hoping to avoid, it’s still there.
However, if you’re trying to cut down on extra chemicals, then yes, you will get very few in deodorant rocks, compared to the very many in spray-on or roll-on deodorants!
Soap and water is a safe, simple, and sufficient alternative: True or False?
True or False, depending on what you want as a result!
- If you care that your deodorant also functions as an antiperspirant, then no, soap and water will certainly not have an antiperspirant effect.
- If you care only about washing off bacteria and eliminating odor for the next little while, then yes, soap and water will work just fine.
Bonus myths:
There is no difference between men’s and women’s deodorants, apart from the marketing: True or False?
False! While to judge by the marketing, the only difference is that one smells of “evening lily” and the other smells of “chainsaw barbecue” or something, the real difference is…
- The “men’s” kind is designed to get past armpit hair and reach the skin without clogging the hair up.
- The “women’s” kind is designed to apply a light coating to the skin that helps avoid chafing and irritation.
In other words… If you are a woman with armpit hair or a man without, you might want to ignore the marketing and choose according to your grooming preferences.
Hopefully you can still find a fragrance that suits!
Shaving (or otherwise depilating) armpits is better for hygiene: True or False?
True or False, depending on what you consider “hygiene”.
Consistent with popular belief, shaving means there is less surface area for bacteria to live on. And empirically speaking, that means a reduction in body odor.
However, shaving typically causes microabrasions, and while there’s no longer hair for the bacteria to enjoy, they now have access to the inside of your skin, something they didn’t have before. This can cause far more unpleasant problems in the long run, for example:
❝Hidradenitis suppurativa is a chronic and debilitating skin disease, whose lesions can range from inflammatory nodules to abscesses and fistulas in the armpits, groin, perineum, inframammary region❞
Read more: Hidradenitis suppurativa: Basic considerations for its approach: A narrative review
And more: Hidradenitis suppurativa: Epidemiology, clinical presentation, and pathogenesis
If this seems a bit “damned if you do; damned if you don’t”, this writer’s preferred way of dodging both is to use electric clippers (the buzzy kind, as used for cutting short hair) to trim hers down low, and thus leave just a little soft fuzz.
What you do with yours is obviously up to you; our job here is just to give the information for everyone to make informed decisions whatever you choose 🙂
Take care!
Share This Post
-
Is stress turning my hair grey?
When we start to go grey depends a lot on genetics.
Your first grey hairs usually appear anywhere between your twenties and fifties. For men, grey hairs normally start at the temples and sideburns. Women tend to start greying on the hairline, especially at the front.
The most rapid greying usually happens between ages 50 and 60. But does anything we do speed up the process? And is there anything we can do to slow it down?
You’ve probably heard that plucking, dyeing and stress can make your hair go grey – and that redheads don’t. Here’s what the science says.
What gives hair its colour?
Each strand of hair is produced by a hair follicle, a tunnel-like opening in your skin. Follicles contain two different kinds of stem cells:
- keratinocytes, which produce keratin, the protein that makes and regenerates hair strands
- melanocytes, which produce melanin, the pigment that colours your hair and skin.
There are two main types of melanin that determine hair colour. Eumelanin is a black-brown pigment and pheomelanin is a red-yellow pigment.
The amount of the different pigments determines hair colour. Black and brown hair has mostly eumelanin, red hair has the most pheomelanin, and blonde hair has just a small amount of both.
So what makes our hair turn grey?
As we age, it’s normal for cells to become less active. In the hair follicle, this means stem cells produce less melanin – turning our hair grey – and less keratin, causing hair thinning and loss.
As less melanin is produced, there is less pigment to give the hair its colour. Grey hair has very little melanin, while white hair has none left.
Unpigmented hair looks grey, white or silver because light reflects off the keratin, which is pale yellow.
Grey hair is thicker, coarser and stiffer than hair with pigment. This is because the shape of the hair follicle becomes irregular as the stem cells change with age.
Interestingly, grey hair also grows faster than pigmented hair, but it uses more energy in the process.
Can stress turn our hair grey?
Yes, stress can cause your hair to turn grey. This happens when oxidative stress damages hair follicles and stem cells and stops them producing melanin.
Oxidative stress is an imbalance of too many damaging free radical chemicals and not enough protective antioxidant chemicals in the body. It can be caused by psychological or emotional stress as well as autoimmune diseases.
Environmental factors such as exposure to UV and pollution, as well as smoking and some drugs, can also play a role.
Melanocytes are more susceptible to damage than keratinocytes because of the complex steps in melanin production. This explains why ageing and stress usually cause hair greying before hair loss.
Scientists have been able to link less pigmented sections of a hair strand to stressful events in a person’s life. In younger people, whose stem cells still produced melanin, colour returned to the hair after the stressful event passed.
4 popular ideas about grey hair – and what science says
1. Does plucking a grey hair make more grow back in its place?
No. When you pluck a hair, you might notice a small bulb at the end that was attached to your scalp. This is the root. It grows from the hair follicle.
Plucking a hair pulls the root out of the follicle. But the follicle itself is the opening in your skin and can’t be plucked out. Each hair follicle can only grow a single hair.
It’s possible frequent plucking could make your hair grey earlier, if the cells that produce melanin are damaged or exhausted from too much regrowth.
2. Can my hair turn grey overnight?
Legend says Marie Antoinette’s hair went completely white the night before the French queen faced the guillotine – but this is a myth.
Melanin in hair strands is chemically stable, meaning it can’t transform instantly.
Acute psychological stress does rapidly deplete melanocyte stem cells in mice. But the effect doesn’t show up immediately. Instead, grey hair becomes visible as the strand grows – at a rate of about 1 cm per month.
Not all hair is in the growing phase at any one time, meaning it can’t all go grey at the same time.
3. Will dyeing make my hair go grey faster?
This depends on the dye.
Temporary and semi-permanent dyes should not cause early greying because they just coat the hair strand without changing its structure. But permanent products cause a chemical reaction with the hair, using an oxidising agent such as hydrogen peroxide.
Accumulation of hydrogen peroxide and other hair dye chemicals in the hair follicle can damage melanocytes and keratinocytes, which can cause greying and hair loss.
4. Is it true redheads don’t go grey?
People with red hair also lose melanin as they age, but differently to those with black or brown hair.
This is because the red-yellow and black-brown pigments are chemically different.
Producing the brown-black pigment eumelanin is more complex and takes more energy, making it more susceptible to damage.
Producing the red-yellow pigment (pheomelanin) causes less oxidative stress, and is simpler. This means it is easier for stem cells to continue to produce pheomelanin, even as their activity reduces with ageing.
With ageing, red hair tends to fade into strawberry blonde and silvery-white. Grey colour is due to reduced eumelanin activity, so it is more common in those with black or brown hair.
Your genetics determine when you’ll start going grey. But you may be able to avoid premature greying by staying healthy, reducing stress and avoiding smoking, too much alcohol and UV exposure.
Eating a healthy diet may also help because vitamin B12, copper, iron, calcium and zinc all influence melanin production and hair pigmentation.
Theresa Larkin, Associate Professor of Medical Sciences, University of Wollongong
This article is republished from The Conversation under a Creative Commons license. Read the original article.
-
Signs Of Low Estrogen In Women: What Your Skin, Hair, & Nails Are Trying To Tell You
Skin, hair, and nails are often thought of purely as a beauty thing, but in fact they can be indicative of a lot of other aspects of health. Dr. Andrea Suarez takes us through some of them in this video about the systemic (i.e., whole-body, not just related to sex things) effects of estrogen, and/or a deficiency thereof.
Beyond the cosmetic
Low estrogen levels are usual in women during and after untreated menopause, resulting in various changes in the skin, hair, and nails, that reflect deeper issues, down to bone health, heart health, brain health, and more. Since we can’t see our bones or hearts or brains without scans (or a serious accident/incident), we’re going to focus on the outward signs of estrogen deficiency.
Estrogen helps maintain healthy collagen production, skin elasticity, wound healing, and moisture retention, making it essential for youthful and resilient skin. Declining estrogen levels with menopause lead to a thinner epidermis, decreased collagen production, and more pronounced wrinkles. Skin elasticity also diminishes, which slows the skin’s ability to recover from stretching or deformation. Wound healing also becomes slower, increasing the risk of infections and extended recovery periods after injuries or surgeries—bearing in mind that collagen is needed in everything from our skin to our internal connective tissue (fascia) and joints and bones. So all those things are going to struggle to recover from injury (and surgery is also an injury) without it.
Other visible changes associated with declining estrogen include significant dryness as a result of reduced hyaluronic acid and glycosaminoglycan production, which are essential for moisture retention. The skin becomes more prone to irritation and increased water loss. Additionally, estrogen deficiency results in less resistance to oxidative stress, making the skin more susceptible to damage from environmental factors such as UV radiation and pollution, as well as any from-the-inside pollution that some may have depending on diet and lifestyle.
Acne and enlarged pores are associated with increased testosterone, but testosterone and estrogen are antagonistic in most ways, and in this case a decrease in estrogen will do the same, due to increased unopposed androgen signaling affecting the oil glands. The loss of supportive collagen also causes the skin around pores to lose structure, making them appear larger. The reduction in skin hydration further exacerbates the visibility of pores and can contribute to the development of blackheads due to abnormal cell turnover.
Blood vessel issues tend to arise as estrogen levels drop, leading to a reduction in angiogenesis, i.e. the formation and integrity of blood vessels. This results in more fragile and leaky blood vessels, making the skin more prone to bruising, especially on areas frequently exposed to the sun, such as the backs of the hands. This weakened vasculature also further contributes to the slower wound healing that we talked about, due to less efficient delivery of growth factors.
Hair and nail changes often accompany estrogen deficiency. Women may notice hair thinning, increased breakage, and a greater likelihood of androgenic alopecia. The texture of the hair can change, becoming more brittle. Similarly, nails can develop ridges, split more easily, and become more fragile due to reduced collagen and keratin production, which also affects the skin around the nails.
As for what to do about it? Management options for estrogen-deficient skin include:
- Bioidentical hormone replacement therapy (HRT), which can improve skin elasticity, boost collagen production, and reduce dryness and fragility, as well as addressing the many more serious internal things that are caused by the same deficiency as these outward signs.
- Low-dose topical estrogen cream, which can help alleviate skin dryness and increase skin strength, but won’t give the systemic benefits (incl. to bones, heart, brain, etc.) that only systemic HRT can yield.
- Plant-based phytoestrogens, which are not well-evidenced, but may be better than nothing if nothing is your only other option. However, if you are taking any other form of estrogen, don’t use phytoestrogens as well, or they will compete for estrogen receptors, doing the job far less well while impeding the bioidentical estrogen from doing its much better job.
And for everyone, at any age, sunscreen continues to be one of the best things to put on one’s skin for general skin health, and this is even more true if running low on estrogen.
For more on all of this, enjoy:
Click Here If The Embedded Video Doesn’t Load Automatically!
Want to learn more?
You might also like:
These Signs Often Mean These Nutrient Deficiencies (Do You Have Any?)
Take care!
-
Egg Noodles vs Soba Noodles – Which is Healthier?
Our Verdict
When comparing egg noodles to soba noodles, we picked the soba.
Why?
First of all, for any unfamiliar, soba noodles are made with buckwheat. Buckwheat, for any unfamiliar, is not wheat and does not contain gluten; it’s just the name of a flowering plant that gets used as though a grain, even though it’s technically not.
In terms of macros, egg noodles have slightly more protein and 2x the fat (of which, some cholesterol), while soba noodles have very slightly more carbs and 3x the fiber (and, being plant-based, no cholesterol). Given that the carbs are almost equal, it’s a case of which we care about more: slightly more protein, or 3x the fiber? We’re going with 3x the fiber, and so are calling this category a win for soba.
In the category of vitamins, egg noodles have more of vitamins A, B12, C, D, E, K, and choline, while soba noodles have more of vitamins B1, B2, B3, B5, B6, and B9. That’s a 6:6 tie. One could argue that egg noodles’ vitamins are the ones more likely to be a deficiency in people, but on the other hand, soba noodles’ vitamins have the greater margins of difference. So, still a tie.
When it comes to minerals, egg noodles have more calcium and selenium, while soba noodles have more copper, iron, magnesium, manganese, phosphorus, potassium, and zinc. So, this one’s not close; it’s an easy win for soba noodles.
Adding up the sections makes for a clear win for soba noodles, but by all means, enjoy moderate portions of either or both (unless you are vegan or allergic to eggs, in which case, skip the egg noodles and just enjoy the soba!).
Want to learn more?
You might like to read:
Egg Noodles vs Rice Noodles – Which is Healthier?
Take care!
-
Mythbusting Moldy Food
Most Food Should Not Be Fuzzy
In yesterday’s newsletter, we asked you for your policy when it comes to mold on food (aside from intentional mold, e.g. blue cheese etc), and the responses were interesting:
- About 49% said “throw the whole thing away no matter what it is; it is dangerous”
- About 24% said “cut the mold off and eat the rest of whatever it is”
- The remainder were divided equally between “eat it all; keep the immune system on its toes” and “cut the mold off bread, but moldy animal products are dangerous”
So what does the science say?
Some molds are safe to eat: True or False?
True! We don’t think this is contentious so we’ll not spend much time on it, but just for the sake of being methodical: the safe molds are the ones on foods that are supposed to have mold on them, including many kinds of cheese and even some kinds of cured meat (salami is an example; that powdery coating is mold).
We could give a big list of safe and unsafe molds, but that would be a list of names and let’s face it, they don’t introduce themselves by name.
However! The litmus test of “is it safe to eat” is:
Did you acquire it with this mold already in place and exactly as expected and advertised?
- If so, it is safe to eat (unless you have an allergy or such)
- If not, it is almost certainly not safe to eat
(more on why, later)
The “sniff test” is a good way to tell if moldy food is bad: True or False?
False. Very false. Because of how the sense of smell works.
You may feel like smell is a way of knowing about something at a distance, but the only way you can smell something is if particles of it are physically connecting with your olfactory receptors inside you. Yes, that has unfortunate implications about bathroom smells, but for now, let’s keep our attention in the kitchen.
If you sniff a moldy item of food, you will now have its mold spores inside your respiratory system. You absolutely do not want them there.
If we cut off the mold, the rest is safe to eat: True or False?
True or False, depending on what it is:
- Hard vegetables (e.g. carrots, cabbage), and hard cheeses (e.g. Gruyère, Gouda) – cut off with an inch margin, and it should be safe
- Soft vegetables (e.g. tomatoes, and any vegetables that were hard but are now soft after cooking) – discard entirely; it is unsafe
- Anything else – discard entirely; it is unsafe
The reason for this is that, in the case of the hard products mentioned, the mycelium roots of the mold cannot penetrate far.
In the case of the soft products mentioned, the surface mold is “the tip of the iceberg”, and the mycelium roots, which you will not usually be able to see, will penetrate the rest of it.
“Anything else” seems like quite a sweeping statement, but fruits, soft cheeses, yogurt, liquids, jams and jellies, cooked grains and pasta, meats, and yes, bread, are all things where the roots can penetrate deeply and easily. Regardless of you only being able to see a small amount, the whole thing is probably moldy.
The USDA has a handy downloadable factsheet:
Molds On Food: Are They Dangerous?
Eating a little mold is good for the immune system: True or False?
False, generally. There are of course countless types of mold, but not only do many of them produce toxic compounds (mycotoxins), but also, a food that has mold will usually have pathogenic bacteria along with the mold.
See for example: Occurrence, Toxicity, and Analysis of Major Mycotoxins in Food
Food poisoning will never make you healthier.
But penicillin is safe to eat: True or False?
False, and also penicillin is not the mold on your bread (or other foods).
Penicillin, an antibiotic* molecule, is produced by some species of the mold genus Penicillium. There are hundreds of known Penicillium species, and most of them are toxic, usually in multiple ways. Take for example:
Penicillium roqueforti PR toxin gene cluster characterization
*it is also not healthy to consume antibiotics unless it is seriously necessary. Antibiotics will wipe out most of your gut’s “good bacteria”, leaving you vulnerable. People have died from C. diff infections for this reason. So obviously, if you really need to take antibiotics, take them as directed, but if not, don’t.
See also: Four Ways Antibiotics Can Kill You
One last thing…
It may be that someone reading this is thinking “I’ve eaten plenty of mold, and I’m fine”. Or perhaps someone you tell about this will say that.
But there are two reasons this logic is flawed:
- Survivorship bias (like people who smoke and live to 102; we just didn’t hear from the 99.9% of people who smoke and die early)
- Being unaware of illness is not being absent of illness. Anyone who’s had an alarming diagnosis of something that started a while ago will know this, of course. It’s also possible to be “low-level ill” often and get used to it as a baseline for health. It doesn’t mean it’s not harmful for you.
Stay safe!
-
Could Just Two Hours Sleep Per Day Be Enough?
Polyphasic Sleep… Super-Schedule Or An Idea Best Put To Rest?
What is it?
Let’s start by defining some terms:
- Monophasic sleep—sleeping in one “chunk” per day. For example, a good night’s “normal” sleep.
- Biphasic sleep—sleeping in two “chunks” per day. Typically, a shorter night’s sleep, with a nap usually around the middle of the day / early afternoon.
- Polyphasic sleep—sleeping in multiple “chunks” per day, typically three or more. Some people do this in order to have more hours awake per day, to do things. The idea is that sleeping this way is more efficient, and one can get enough rest in less time. The most popular schedules used are:
- The Überman schedule—six evenly-spaced 20-minute naps, one every four hours, throughout the 24-hour day. The name is a semi-anglicized version of the German word Übermensch, “Superman”.
- The Everyman schedule—a less extreme schedule, with a three-hour “long sleep” during the night, and three evenly-spaced 20-minute naps during the day, for a total of 4 hours of sleep.
There are other schedules, but we’ll focus on the most popular ones here.
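To make the arithmetic behind those schedules concrete, here’s a quick illustrative sketch (the function and schedule parameters are just a restatement of the descriptions above, not anything from the book or site):

```python
def total_sleep_hours(core_hours, naps, nap_minutes):
    """Total daily sleep: one 'core' block plus a number of short naps."""
    return core_hours + naps * nap_minutes / 60

# Überman: no core sleep, six 20-minute naps across the 24-hour day
uberman = total_sleep_hours(core_hours=0, naps=6, nap_minutes=20)

# Everyman: a three-hour core sleep plus three 20-minute naps
everyman = total_sleep_hours(core_hours=3, naps=3, nap_minutes=20)

print(uberman)   # 2.0 hours per day
print(everyman)  # 4.0 hours per day
```

Which is where the “two hours sleep per day” of the title comes from.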
Want to learn about the others? Visit: Polyphasic.Net (a website by and for polyphasic sleep enthusiasts)
Some people have pointed to evidence that suggests humans are naturally polyphasic sleepers, and that it is only modern lifestyles that have forced us to be (mostly) monophasic.
There is at least some evidence to suggest that when environmental light/dark conditions are changed (because of extreme seasonal variation at the poles, or, as in this case, because of artificial changes as part of a sleep science experiment), we adjust our sleeping patterns accordingly.
The counterpoint, of course, is that perhaps when at the mercy of long days/nights at the poles, or no air-conditioning to deal with the heat of the day in the tropics, that perhaps we were forced to be polyphasic, and now, with modern technology and greater control, we are free to be monophasic.
Either way, there are plenty of people who take up the practice of polyphasic sleep.
Ok, But… Why?
The main motivation for trying polyphasic sleep is simply to have more hours in the day! It’s exciting, the prospect of having 22 hours per day to be so productive and still have time over for leisure.
A secondary motivation for trying polyphasic sleep is that when the brain is sleep-deprived, it will prioritize REM sleep. Here’s where the Überman schedule becomes perhaps most interesting:
The six evenly-spaced naps of the Überman schedule are each 20 minutes long. This corresponds to the approximate length of a normal REM cycle.
Consequently, when your head hits the pillow, you’ll immediately begin dreaming, and at the end of your dream, the alarm will go off.
Waking up at the end of a dream, when one hasn’t yet entered a non-REM phase of sleep, will make you more likely to remember it. Similarly, going straight into REM sleep will make you more likely to be aware of it, thus, lucid dreaming.
Read: Sleep fragmentation and lucid dreaming (actually a very interesting and informative lucid dreaming study even if you don’t want to take up polyphasic sleep)
Six 20-minute lucid-dreaming sessions per day?! While awake for the other 22 hours?! That’s… 24 hours per day of wakefulness to use as you please! What sorcery is this?
Hence, it has quite an understandable appeal.
Next Question: Does it work?
Can we get by without the other (non-REM) kinds of sleep?
According to Überman cycle enthusiasts: Yes! The body and brain will adapt.
According to sleep scientists: No! The non-REM slow-wave phases of sleep are essential.
Read: Adverse impact of polyphasic sleep patterns in humans—Report of the National Sleep Foundation sleep timing and variability consensus panel
(if you want to know just how bad it is… the top-listed “similar article” is entitled “Suicidal Ideation”)
But what about, for example, the Everyman schedule? Three hours at night is enough for some non-REM sleep, right?
It is, and so it’s not as quickly deleterious to the health as the Überman schedule. But, unless you are blessed with rare genes that allow you to operate comfortably on 4 hours per day (you’ll know already if that describes you, without having to run any experiment), it’s still bad.
Adults typically need 7–9 hours of sleep per night, and if you don’t get it, you’ll accumulate a sleep debt. And, importantly:
When you accumulate sleep debt, you are borrowing time at a very high rate of interest!
And, at risk of laboring the metaphor, but this is important too:
Not only will you have to pay it back soon (with interest), you will be hounded by the debt collection agents—decreased cognitive ability and decreased physical ability—until you pay up.
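To put rough numbers on that “interest”, here’s a quick illustrative sketch (assuming an adult needs about 8 hours per night, the midpoint of the 7–9 hour range mentioned above):

```python
NEEDED_PER_DAY = 8  # hours; assumption: midpoint of the usual 7-9 h recommendation

def weekly_sleep_debt(slept_per_day):
    """Hours of sleep debt accumulated over one week at a given daily total."""
    return (NEEDED_PER_DAY - slept_per_day) * 7

print(weekly_sleep_debt(2))  # Überman schedule: 42 hours of debt per week
print(weekly_sleep_debt(4))  # Everyman schedule: 28 hours of debt per week
```

In other words, even the gentler Everyman schedule racks up more than a full day of missed sleep every week.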
In summary:
- Polyphasic sleep is really very tempting
- It will give you more hours per day (for a while)
- It will give the promised lucid dreaming benefits (which is great until you start micronapping between naps; each micronap is effectively a mini psychotic break from reality lasting a split second, and can be deadly behind the wheel of a car, for instance!)
- It is unequivocally bad for the health and we do not recommend it
Bottom line:
Some of the claimed benefits are real, but are incredibly short-term, unsustainable, and come at a cost that’s far too high. We get why it’s tempting, but ultimately, it’s self-sabotage.
(Sadly! We really wanted it to work, too…)