I’m feeling run down. Why am I more likely to get sick? And how can I boost my immune system?

10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.

It has been a long winter, filled with many viruses and cost-of-living pressures, on top of the usual mix of work, study, life admin and caring responsibilities.

Stress is an inevitable part of life. In short bursts, our stress response has evolved as a survival mechanism to help us be more alert in fight or flight situations.

But when stress is chronic, it weakens the immune system and makes us more vulnerable to illnesses such as the common cold, flu and COVID.

Pexels/Ketut Subiyanto

Stress makes it harder to fight off viruses

When the immune system starts to break down, a virus that would normally have been under control starts to flourish.

Once you begin to feel sick, the stress response rises, making it harder for the immune system to fight off the disease. You may be sick more often and for longer periods of time, without enough immune cells primed and ready to fight.

In the 1990s, American psychology professor Sheldon Cohen and his colleagues conducted a number of studies where healthy people were exposed to an upper respiratory infection, through drops of virus placed directly into their nose.

These participants were then quarantined in a hotel and monitored closely to determine who became ill.

One of the most important factors predicting who got sick was prolonged psychological stress.

Cortisol suppresses immunity

“Short-term stress” is stress that lasts for a period of minutes to hours, while “chronic stress” persists for several hours per day for weeks or months.

When faced with a perceived threat, psychological or physical, the hypothalamus region of the brain sets off an alarm system. This signals the release of a surge of hormones, including adrenaline and cortisol.

Human brain illustration
The hypothalamus sets off an alarm system in response to a real or perceived threat. stefan3andrei/Shutterstock

In a typical stress response, cortisol levels quickly increase when stress occurs, and then rapidly drop back to normal once the stress has subsided. In the short term, cortisol suppresses inflammation, to ensure the body has enough energy available to respond to an immediate threat.

But in the longer term, chronic stress can be harmful. A Harvard University study from 2022 showed that people suffering from psychological distress in the lead-up to their COVID infection had a greater chance of experiencing long COVID. They classified this distress as depression, probable anxiety, perceived stress, worry about COVID and loneliness.

Those suffering distress had close to a 50% greater risk of long COVID compared to other participants. Cortisol has been shown to be high in the most severe cases of COVID.

Stress causes inflammation

Inflammation is a short-term reaction to an injury or infection. It is responsible for trafficking immune cells in your body so the right cells are present in the right locations at the right times and at the right levels.

The immune cells also store a memory of that threat to respond faster and more effectively the next time.

Initially, circulating immune cells detect and flock to the site of infection. Messenger proteins, known as pro-inflammatory cytokines, are released by immune cells, to signal the danger and recruit help, and our immune system responds to neutralise the threat.

During this response to the infection, if the immune system produces too much of these inflammatory chemicals, it can trigger symptoms such as nasal congestion and runny nose.

Man blows nose
Our immune response can trigger symptoms such as a runny nose. Alyona Mandrik/Shutterstock

What about chronic stress?

Chronic stress causes persistently high cortisol secretion, which remains high even in the absence of an immediate stressor.

The immune system becomes desensitised to cortisol and stops responding to its suppressive signal, increasing low-grade “silent” inflammation and the production of pro-inflammatory cytokines (the messenger proteins).

Immune cells become exhausted and start to malfunction. The body loses the ability to turn down the inflammatory response.

Over time, the immune system changes the way it responds by reprogramming to a “low surveillance mode”. The immune system misses early opportunities to destroy threats, and the process of recovery can take longer.

So how can you manage your stress?

We can actively strengthen our immunity and natural defences by managing our stress levels. Rather than letting stress build up, try to address it early and frequently by:

1) Getting enough sleep

Getting enough sleep reduces cortisol levels and inflammation. During sleep, the immune system releases cytokines, which help fight infections and inflammation.

2) Taking regular exercise

Exercising helps the lymphatic system (which balances bodily fluids as part of the immune system) circulate and allows immune cells to monitor for threats, while sweating flushes toxins. Physical activity also lowers stress hormone levels through the release of positive brain signals.

3) Eating a healthy diet

Ensuring your diet contains enough nutrients – such as the B vitamins, and the full breadth of minerals like magnesium, iron and zinc – during times of stress has a positive impact on overall stress levels. Staying hydrated helps the body to flush out toxins.

4) Socialising and practising meditation or mindfulness

These activities increase endorphins and serotonin, which improve mood and have anti-inflammatory effects. Breathing exercises and meditation stimulate the parasympathetic nervous system, which calms down our stress responses so we can “reset” and reduce cortisol levels.

Sathana Dushyanthen, Academic Specialist & Lecturer in Cancer Sciences & Digital Health | Superstar of STEM | Science Communicator, The University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Don’t Forget…

Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!

Recommended

  • What’s the difference between period pain and endometriosis pain?
  • How To Leverage Placebo Effect For Yourself
    Delve into the placebo effect’s potent influence on medical treatments and leverage its powers for health—whether it’s pill size, rituals, or the power of belief.

Learn to Age Gracefully

Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:

  • 7 Fruits Every Senior Should Eat Today (And Why)

    What will you prioritize in the new year?

    Fruits to enjoy regularly

    The 7 fruits recommended for seniors in this video are:

    Apples

    • Rich in soluble fiber (pectin) for lowering LDL cholesterol.
    • Contains phytochemicals such as quercetin and other polyphenols that reduce inflammation and support heart health.
    • High in vitamin C for immunity, skin elasticity, and joint health.

    Bananas

    • Natural energy boost from carbohydrates.
    • High in potassium for regulating blood pressure, fluid balance, and preventing muscle cramps.
    • Supports cardiovascular health and muscle function.

    Avocados

    • Rich in monounsaturated fats to improve cholesterol levels.
    • High in potassium for blood pressure regulation.
    • Contains vitamins E and K for brain health and bone density.

    Grapes

    • Hydrating and rich in antioxidants like resveratrol, which supports circulation and reduces inflammation.
    • Contain vitamins C and K for immunity and bone health.

    Plums

    • Natural laxative with high fiber and sorbitol for digestive health.
    • Rich in potassium and vitamin K for bone density and reducing osteoporosis risk.
    • Contain polyphenols for reducing inflammation and supporting cognitive health.

    Pomegranates

    • Anti-inflammatory and antioxidant-rich (especially punicalagins and anthocyanins).
    • Supports heart health, improves cholesterol levels, and promotes brain health.
    • May help inhibit cancer cell growth in specific types.

    Kiwi

    • High in vitamin C to boost immunity.
    • Rich in fiber and enzymes for digestive health.
    • Low glycemic index, suitable for blood sugar management.

    10almonds note: a lot of those statements can go for a lot of fruits, but those are definitely high on the list for the qualities mentioned!

    Want to learn more?

    You might also like to read:

    Top 8 Fruits That Prevent & Kill Cancer ← there are two fruits that appear on both lists!

    Take care!

  • Parsley vs Spinach – Which is Healthier?

    Our Verdict

    When comparing parsley to spinach, we picked the parsley.

    Why?

    First of all, writer’s anecdote: today’s choice brought to you by a real decision here in my household! You see, a certain dish I sometimes prepare (it’s just a wrap-based dish, nothing fancy) requires a greenery component, and historically I’ve used kale or spinach. Of those two, I prefer kale while my son, who lives (and dines) with me, prefers spinach. However, we both like parsley equally, so I’m going to use that today. But I was curious about how it performed nutritionally, hence today’s comparison!

    Ok, now for the stats…

In terms of macros, the only difference is that parsley has more fiber and carbs, for an approximately equal glycemic index, so we’ll go with the one with the higher total fiber, which is parsley.

In the category of vitamins, parsley has more of vitamins B3, B5, B7, B9, C, and K, while spinach has more of vitamins A, B2, B6, E, and choline. So, a marginal 6:5 win for parsley (and the margins of difference are also in parsley’s favor; for example, parsley has 13x the vitamin C, and 2x or 3x the other vitamins it won on, while spinach boasts 2x for some vitamins, and only 1.2x or 1.5x the others).

    When it comes to minerals, parsley has more iron, phosphorus, potassium, and zinc, while spinach has more copper, magnesium, manganese, and selenium. So, a 4:4 tie on these.

    In terms of phytochemicals, parsley has a much higher polyphenol content (that’s good) while spinach has a much higher oxalate content (that’s neutral for most people, but bad if you have certain kidney problems). So, another win for parsley.

Adding up the sections makes a clear overall win for parsley, but by all means enjoy either or both, unless you are avoiding oxalates. In that case, the oxalates in spinach can be reduced by cooking, but honestly, for most dishes you might as well just pick a different greens option (like parsley, or collard greens if you want something closer to the culinary experience of eating spinach).

    Want to learn more?

    You might like:

    Invigorating Sabzi Khordan ← another great way to enjoy parsley as main ingredient rather than just a seasoning

    Enjoy!

  • Is owning a dog good for your health?

    Australia loves dogs. We have one of the highest rates of pet ownership in the world, and one in two households has at least one dog.

    But are they good for our health?

    Mental health is the second-most common reason cited for getting a dog, after companionship. And many of us say we “feel healthier” for having a dog – and let them sleep in our bedroom.

    Here’s what it means for our physical and mental health to share our homes (and doonas) with our canine companions.

    Pogodina Natalia/Shutterstock

    Are there physical health benefits to having a dog?

    Having a dog is linked to lower risk of death over the long term. In 2019, a systematic review gathered evidence published over 70 years, involving nearly four million individual medical cases. It found people who owned a dog had a 24% lower risk of dying from any cause compared to those who did not own a dog.

    A golden retriever pants on the grass next to a ball.
    Having a dog may help lower your blood pressure through more physical activity. Barnabas Davoti/Pexels

    Dog ownership was linked to increased physical activity. This lowered blood pressure and helped reduce the risk of stroke and heart disease.

    The review found for those with previous heart-related medical issues (such as heart attack), living with a dog reduced their subsequent risk of dying by 35%, compared to people with the same history but no dog.

    Another recent UK study found adult dog owners were almost four times as likely to meet daily physical activity targets as non-owners. Children in households with a dog were also more active and engaged in more unstructured play, compared to children whose family didn’t have a dog.

    Exposure to dirt and microbes carried in from outdoors may also strengthen immune systems and lead to less use of antibiotics in young children who grow up with dogs.

    A boy in sunglasses talks to his jack russell terrier on a beach.
    Children in households with a dog were often more active. Maryshot/Shutterstock

    Health risks

    However, dogs can also pose risks to our physical health. One of the most common health issues for pet owners is allergies.

    Dogs’ saliva, urine and dander (the skin cells they shed) can trigger allergic reactions resulting in a range of symptoms, from itchy eyes and runny nose to breathing difficulties.

    A recent meta-analysis pooled data from nearly two million children. Findings suggested early exposure to dogs may increase the risk of developing asthma (although not quite as much as having a cat does). The child’s age, how much contact they have with the dog and their individual risk all play a part.

    Slips, trips and falls are another risk – more people fall over due to dogs than cats.

Having a dog can also expose you to bites and scratches which may become infected and pose a risk for those with compromised immune systems. And they can introduce zoonotic diseases into your home, including ringworm and Campylobacter, a bacterium that causes diarrhoea.

For those sharing the bed, there is an elevated risk of allergies and of picking up ringworm. It may also result in lost sleep, as dogs move around at night.

    On the other hand some owners report feeling more secure while co-sleeping with their dogs, with the emotional benefit outweighing the possibility of sleep disturbance or waking up with flea bites.

    Proper veterinary care and hygiene practices are essential to minimise these risks.

    A dog peers out from under a doona while a man sleeps.
    Many of us don’t just share a home with a dog – we let them sleep in our beds. Claudia Mañas/Unsplash

    What about mental health?

    Many people know the benefits of having a dog are not only physical.

As companions, dogs can provide significant emotional support, helping to alleviate symptoms of anxiety, depression and post-traumatic stress. Their presence may offer comfort and a sense of purpose to individuals facing mental health challenges.

    Loneliness is a significant and growing public health issue in Australia.

    In the dog park and your neighbourhood, dogs can make it easier to strike up conversations with strangers and make new friends. These social interactions can help build a sense of community belonging and reduce feelings of social isolation.

    For older adults, dog walking can be a valuable loneliness intervention that encourages social interaction with neighbours, while also combating declining physical activity.

    However, if you’re experiencing chronic loneliness, it may be hard to engage with other people during walks. An Australian study found simply getting a dog was linked to decreased loneliness. People reported an improved mood – possibly due to the benefits of strengthening bonds with their dog.

    Three dogs on leash sniff each other.
    Walking a dog can make it easier to talk to people in your neighbourhood. KPegg/Shutterstock

    What are the drawbacks?

    While dogs can bring immense joy and numerous health benefits, there are also downsides and challenges. The responsibility of caring for a dog, especially one with behavioural issues or health problems, can be overwhelming and create financial stress.

    Dogs have shorter lifespans than humans, and the loss of a beloved companion can lead to depression or exacerbate existing mental health conditions.

    Lifestyle compatibility and housing conditions also play a significant role in whether having a dog is a good fit.

    The so-called pet effect suggests that pets, often dogs, improve human physical and mental health in all situations and for all people. The reality is more nuanced. For some, having a pet may be more stressful than beneficial.

    Importantly, the animals that share our homes are not just “tools” for human health. Owners and dogs can mutually benefit when the welfare and wellbeing of both are maintained.

    Tania Signal, Professor of Psychology, School of Health, Medical and Applied Sciences, CQUniversity Australia

    This article is republished from The Conversation under a Creative Commons license. Read the original article.

Related Posts

  • What’s the difference between period pain and endometriosis pain?
  • Prolonged Grief: A New Mental Disorder?

    The issue is not whether certain mental conditions are real—they are. It is how we conceptualize them and what we think treating them requires.

The latest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) features a new diagnosis: prolonged grief disorder—used for those who, a year after a loss, still remain incapacitated by it. This addition follows more than a decade of debate. Supporters argued that the addition enables clinicians to provide much-needed help to those afflicted by what one might simply consider too much grief, whereas opponents insisted that one mustn’t unduly pathologize grief, and rejected an increasingly medicalized approach to a condition they considered part of a normal process of dealing with loss—a process which in some simply takes longer than in others.

    By including a condition in a professional classification system, we collectively recognize it as real. Recognizing hitherto unnamed conditions can help remove certain kinds of disadvantages. Miranda Fricker emphasizes this in her discussion of what she dubs hermeneutic injustice: a specific sort of epistemic injustice that affects persons in their capacity as knowers1. Creating terms like ‘post-natal depression’ and ‘sexual harassment’, Fricker argues, filled lacunae in the collectively available hermeneutic resources that existed where names for distinctive kinds of social experience should have been. The absence of such resources, Fricker holds, put those who suffered from such experiences at an epistemic disadvantage: they lacked the words to talk about them, understand them, and articulate how they were wronged. Simultaneously, such absences prevented wrong-doers from properly understanding and facing the harm they were inflicting—e.g. those who would ridicule or scold mothers of newborns for not being happier or those who would either actively engage in sexual harassment or (knowingly or not) support the societal structures that helped make it seem as if it was something women just had to put up with. 

    For Fricker, the hermeneutical disadvantage faced by those who suffer from an as-of-yet ill-understood and largely undiagnosed medical condition is not an epistemic injustice. Those so disadvantaged are not excluded from full participation in hermeneutic practices, or at least not through mechanisms of social coercion that arise due to some structural identity prejudice. They are not, in other words, hermeneutically marginalized, which for Fricker, is an essential characteristic of epistemic injustice. Instead, their situation is simply one of “circumstantial epistemic bad luck”2. Still, Fricker, too, can agree that providing labels for ill-understood conditions is valuable. Naming a condition helps raise awareness of it, makes it discursively available and, thus, a possible object of knowledge and understanding. This, in turn, can enable those afflicted by it to understand their experience and give those who care about them another way of nudging them into seeking help. 

    Surely, if adding prolonged grief disorder to the DSM-5 were merely a matter of recognizing the condition and of facilitating assistance, nobody should have any qualms with it. However, the addition also turns intense grief into a mental disorder—something for whose treatment insurance companies can be billed. With this, significant forces of interest enter the scene. The DSM-5, recall, is mainly consulted by psychiatrists. In contrast to talk-therapists like psychotherapists or psychoanalysts, psychiatrists constitute a highly medicalized profession, in which symptoms—clustered together as syndromes or disorders—are frequently taken to require drugs to treat them. Adding prolonged grief disorder thus heralds the advent of research into various drug-based grief therapies. Ellen Barry of the New York Times confirms this: “naltrexone, a drug used to help treat addiction,” she reports, “is currently in clinical trials as a form of grief therapy”, and we are likely to see a “competition for approval of medicines by the Food and Drug Administration.”3

    Adding diagnoses to the DSM-5 creates financial incentives for players in the pharmaceutical industry to develop drugs advertised as providing relief to those so diagnosed. Surely, for various conditions, providing drug-induced relief from severe symptoms is useful, even necessary to enable patients to return to normal levels of functioning. But while drugs may help suppress feelings associated with intense grief, they cannot remove the grief. If all mental illnesses were brain diseases, they might be removed by adhering to some drug regimen or other. Note, however, that ‘mental illness’ is a metaphor that carries the implicit suggestion that just like physical illnesses, mental afflictions, too, are curable by providing the right kind of physical treatment. Unsurprisingly, this metaphor is embraced by those who stand to massively benefit from what profits they may reap from selling a plethora of drugs to those diagnosed with any of what seems like an ever-increasing number of mental disorders. But metaphors have limits. Lou Marinoff, a proponent of philosophical counselling, puts the point aptly:

    Those who are dysfunctional by reason of physical illness entirely beyond their control—such as manic-depressives—are helped by medication. For handling that kind of problem, make your first stop a psychiatrist’s office. But if your problem is about identity or values or ethics, your worst bet is to let someone reify a mental illness and write a prescription. There is no pill that will make you find yourself, achieve your goals, or do the right thing.

    Much more could be said about the differences between psychotherapy, psychiatry, and the newcomer in the field: philosophical counselling. Interested readers may benefit from consulting Marinoff’s work. Written in a provocative, sometimes alarmist style, it is both entertaining and—if taken with a substantial grain of salt—frequently insightful. My own view is this: from Fricker’s work, we can extract reasons to side with the proponents of adding prolonged grief disorder to the DSM-5. Creating hermeneutic resources that allow us to help raise awareness, promote understanding, and facilitate assistance is commendable. If the addition achieves that, we should welcome it. And yet, one may indeed worry that practitioners are too eager to move from the recognition of a mental condition to the implementation of therapeutic interventions that are based on the assumption that such afflictions must be understood on the model of physical disease. The issue is not whether certain mental conditions are real—they are. It is how we conceptualize them and what we think treating them requires.

    No doubt, grief manifests physically. It is, however, not primarily a physical condition—let alone a brain disease. Grief is a distinctive mental condition. Apart from bouts of sadness, its symptoms typically include the loss of orientation or a sense of meaning. To overcome grief, we must come to terms with who we are or can be without the loved one’s physical presence in our life. We may need to reinvent ourselves, figure out how to be better again and whence to derive a new purpose. What is at stake is our sense of identity, our self-worth, and, ultimately, our happiness. Thinking that such issues are best addressed by popping pills puts us on a dangerous path, leading perhaps towards the kind of dystopian society Aldous Huxley imagined in his 1932 novel Brave New World. It does little to help us understand, let alone address, the moral and broader philosophical issues that trouble the bereaved and that lie at the root not just of prolonged grief but, arguably, of many so-called mental illnesses.

    Footnotes:

    1 For this and the following, cf. Fricker 2007, chapter 7.

    2 Fricker 2007: 152

    3 Barry 2022

    References:

Barry, E. (2022). “How Long Should It Take to Grieve? Psychiatry Has Come Up With an Answer.” The New York Times, 03/18/2022, URL = https://www.nytimes.com/2022/03/18/health/prolonged-grief-disorder.html [last access: 04/05/2022]
Fricker, M. (2007). Epistemic Injustice: Power and the Ethics of Knowing. Oxford/New York: Oxford University Press.
    Huxley, A. (1932). Brave New World. New York: Harper Brothers.
    Marinoff, L. (1999). Plato, not Prozac! New York: HarperCollins Publishers.

Professor Raja Rosenhagen is currently serving as Assistant Professor of Philosophy, Head of Department, and Associate Dean of Academic Affairs at Ashoka University. He earned his PhD in Philosophy from the University of Pittsburgh and has a broad range of philosophical interests. He wrote this article a) because he was invited to do so and b) because he is currently nurturing a growing interest in philosophical counselling.

    This article is republished from OpenAxis under a Creative Commons license. Read the original article.

  • What are nootropics and do they really boost your brain?

    Humans have long been searching for a “magic elixir” to make us smarter, and improve our focus and memory. This includes traditional Chinese medicine used thousands of years ago to improve cognitive function.

    Now we have nootropics, also known as smart drugs, brain boosters or cognitive enhancers.

    You can buy these gummies, chewing gums, pills and skin patches online, or from supermarkets, pharmacies or petrol stations. You don’t need a prescription or to consult a health professional.

    But do nootropics actually boost your brain? Here’s what the science says.

    LuckyStep/Shutterstock

    What are nootropics and how do they work?

Romanian psychologist and chemist Corneliu E. Giurgea coined the term nootropics in the early 1970s to describe compounds that may boost memory and learning. The term comes from the Greek words nöos (thinking) and tropein (guide).

    Nootropics may work in the brain by improving transmission of signals between nerve cells, maintaining the health of nerve cells, and helping in energy production. Some nootropics have antioxidant properties and may reduce damage to nerve cells in the brain caused by the accumulation of free radicals.

    But how safe and effective are they? Let’s look at four of the most widely used nootropics.

    1. Caffeine

    You might be surprised to know caffeine is a nootropic. No wonder so many of us start our day with a coffee. It stimulates our nervous system.

    Caffeine is rapidly absorbed into the blood and distributed in nearly all human tissues. This includes the brain where it increases our alertness, reaction time and mood, and we feel as if we have more energy.

    For caffeine to have these effects, you need to consume 32-300 milligrams in a single dose. That’s equivalent to around two espressos (for the 300mg dose). So, why the wide range? Genetic variations in a particular gene (the CYP1A2 gene) can affect how fast you metabolise caffeine. So this can explain why some people need more caffeine than others to recognise any neurostimulant effect.

    Unfortunately too much caffeine can lead to anxiety-like symptoms and panic attacks, sleep disturbances, hallucinations, gut disturbances and heart problems.

    So it’s recommended adults drink no more than 400mg caffeine a day, the equivalent of up to three espressos.

    Two blue coffee cups on wooden table, one with coffee art, the other empty
    Caffeine can make you feel alert and can boost your mood. That makes it a nootropic. LHshooter/Shutterstock

    2. L-theanine

    L-theanine comes as a supplement, chewing gum or in a beverage. It’s also the most common amino acid in green tea.

    Consuming L-theanine as a supplement may increase production of alpha waves in the brain. These are associated with increased alertness and perception of calmness.

However, its effect on cognitive functioning is still unclear. Various studies, including those comparing a single dose with a daily dose for several weeks, and in different populations, show different outcomes.

    But taking L-theanine with caffeine as a supplement improved cognitive performance and alertness in one study. Young adults who consumed L-theanine (97mg) plus caffeine (40mg) could more accurately switch between tasks after a single dose, and said they were more alert.

    Another study of people who took L-theanine with caffeine at similar doses to the study above found improvements in several cognitive outcomes, including being less susceptible to distraction.

    Although pure L-theanine is well tolerated, there are still relatively few human trials to show it works or is safe over a prolonged period of time. Larger and longer studies examining the optimal dose are also needed.

    Two clear mugs of green tea, with leaves on wooden table
    The amino acid L-theanine is also in green tea. grafvision/Shutterstock

3. Ashwagandha

Ashwagandha is a plant extract commonly used in Indian Ayurvedic medicine for improving memory and cognitive function.

    In one study, 225-400mg daily for 30 days improved cognitive performance in healthy males. There were significant improvements in cognitive flexibility (the ability to switch tasks), visual memory (recalling an image), reaction time (response to a stimulus) and executive functioning (recognising rules and categories, and managing rapid decision making).

    There are similar effects in older adults with mild cognitive impairment.

    But we should be cautious about results from studies using Ashwagandha supplements; the studies are relatively small and only treated participants for a short time.

    Ashwagandha is a plant extract
    Ashwagandha is a plant extract commonly used in Ayurvedic medicine. Agnieszka Kwiecień, Nova/Wikimedia, CC BY-SA

    4. Creatine

    Creatine is an organic compound involved in how the body generates energy and is used as a sports supplement. But it also has cognitive effects.

    In a review of the available evidence, healthy adults aged 66-76 who took creatine supplements had improved short-term memory.

    Long-term supplementation may also have benefits. In another study, people with fatigue after COVID took 4g a day of creatine for six months and reported they were better able to concentrate, and were less fatigued. Creatine may reduce brain inflammation and oxidative stress, to improve cognitive performance and reduce fatigue.

    Side effects of creatine supplements are rarely reported in studies, but they include weight gain, gastrointestinal upset, and changes in liver and kidney function.

    Where to now?

    There is good evidence for brain boosting effects of caffeine and creatine. But the jury is still out on the efficacy, optimal dose and safety of most other nootropics.

    So until we have more evidence, consult your health professional before taking a nootropic.

    But drinking your daily coffee isn’t likely to do much harm. Thank goodness, because for some of us, it is a magic elixir.

    Nenad Naumovski, Professor in Food Science and Human Nutrition, University of Canberra; Amanda Bulman, PhD candidate studying the effects of nutrients on sleep, University of Canberra, and Andrew McKune, Professor, Exercise Science, University of Canberra

    This article is republished from The Conversation under a Creative Commons license. Read the original article.

    Don’t Forget…

    Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!

    Learn to Age Gracefully

    Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:

  • Grains: Bread Of Life, Or Cereal Killer?


    Going Against The Grain?

    In Wednesday’s newsletter, we asked you for your health-related opinion of grains (aside from any gluten-specific concerns), and got the set of responses described below:

    • About 69% said “They are an important cornerstone of a healthy balanced diet”
    • About 22% said “They can be enjoyed in moderation, but watch out”
    • About 8% said “They are terrible health-drainers that will kill us”

    So, what does the science say?

    They are terrible health-drainers that will kill us: True or False?

    True or False depending on the manner of their consumption!

    There is a big difference between the average pizza base and a bowl of oats, for instance. Or rather, there are a lot of differences, but what’s most critical here?

    The key is: refined and ultraprocessed grains are so inferior to whole grains as to be actively negative for health in most cases for most people most of the time.

    But! It’s not because processing is inherently evil (in reality: some processed foods are healthy, and some unprocessed foods are poisonous), although “less processed is better” is a very good general rule of thumb.

    So, we need to understand the “why” behind the “key” that we just gave above, and that’s mostly about the resultant glycemic index and associated metrics (glycemic load, insulin index, etc).
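    To make the “associated metrics” concrete: glycemic load is conventionally calculated as the food’s glycemic index multiplied by its available carbohydrate content in grams, divided by 100. Here’s a minimal sketch of that calculation; the GI and carbohydrate figures in the comments are approximate illustrative values from published GI tables, not precise measurements:

    ```python
    def glycemic_load(glycemic_index: float, available_carbs_g: float) -> float:
        """Glycemic load = GI x available carbohydrate (g) / 100."""
        return glycemic_index * available_carbs_g / 100

    # Approximate illustrative values:
    # a bowl of rolled oats: GI ~55, ~27g available carbs
    # two slices of white bread: GI ~75, ~28g available carbs
    oats_gl = glycemic_load(55, 27)    # ~14.9 (moderate glycemic load)
    bread_gl = glycemic_load(75, 28)   # ~21.0 (high glycemic load)
    ```

    The point of the metric: even at similar carbohydrate totals, the refined-grain option delivers a higher glycemic load, i.e. sugar arriving faster than the body can comfortably process it.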

    In the case of refined and ultraprocessed grains, our body gains sugar faster than it can process it, and stores it wherever and however it can, like someone who has just realised that they will be entertaining a houseguest in 10 minutes and must tidy up super-rapidly by hiding things wherever they’ll fit.

    And when the body tries to do this with sugar from refined grains, the result is very bad for multiple organs (most notably the liver, but the pancreas takes quite a hit too), which in turn causes damage elsewhere in the body. Not to mention that we now have urgently-produced fat stored in unfortunate places, like the liver and abdominal cavity, when it should have gone to subcutaneous fat stores instead.

    In contrast, whole grains come with fiber that slows down the absorption of the sugars, such that the body can deal with them in an ideal fashion, which usually means:

    • using them immediately, or
    • storing them as muscle glycogen, or
    • storing them as subcutaneous fat

    👆 that’s an oversimplification, but we only have so much room here.

    For more on this, see:

    Glycemic Index vs Glycemic Load vs Insulin Index

    And for why this matters, see:

    Which Sugars Are Healthier, And Which Are Just The Same?

    And for fixing it, see:

    How To Unfatty A Fatty Liver

    They can be enjoyed in moderation, but watch out: True or False?

    Technically True but functionally False:

    • Technically true: “in moderation” is doing a lot of heavy lifting here. One person’s “moderation” may be another person’s “abstemiousness” or “gluttony”.
    • Functionally false: while of course extreme consumption of pretty much anything is going to be bad, unless you are Cereals Georg eating 10,000 cereals each day and being a statistical outlier, the issue is not the quantity so much as the quality.

    Quality, as discussed above, is paramount. As for quantity, you might want to know a baseline for “getting enough”, so…

    They are an important cornerstone of a healthy balanced diet: True or False?

    True! This one’s quite straightforward.

    3 servings of whole grains per day (90g in total, e.g. two slices of wholegrain bread and a bowl of cereal) is associated with a 22% reduction in risk of heart disease, 5% reduction in all-cause mortality, and a lot of benefits across a lot of disease risks:

    ❝This meta-analysis provides further evidence that whole grain intake is associated with a reduced risk of coronary heart disease, cardiovascular disease, and total cancer, and mortality from all causes, respiratory diseases, infectious diseases, diabetes, and all non-cardiovascular, non-cancer causes.

    These findings support dietary guidelines that recommend increased intake of whole grain to reduce the risk of chronic diseases and premature mortality.❞

    ~ Dr. Dagfinn Aune et al.

    Read in full: Whole grain consumption and risk of cardiovascular disease, cancer, and all cause and cause specific mortality: systematic review and dose-response meta-analysis of prospective studies

    We’d like to give a lot more sources for the same findings, as well as papers for all the individual claims, but frankly, there are so many that there isn’t room. Suffice it to say, this is neither controversial nor uncertain; these benefits are well-established.

    Here’s a very informative pop-science article, that also covers some of the things we discussed earlier (it shows what happens during refinement of grains) before getting on to recommendations and more citations for claims than we can fit here:

    Harvard School Of Public Health | Whole Grains

    “That’s all great, but what if I am concerned about gluten?”

    There certainly are reasons you might be, be it because of a sensitivity, allergy, or just because perhaps you’d like to know more.

    Let’s first mention: not all grains contain gluten, so it’s perfectly possible to enjoy naturally gluten-free grains (such as oats and rice) as well as gluten-free pseudocereals, which are not actually grains but do the same job in culinary and nutritional terms (such as quinoa and buckwheat, despite the latter’s name).

    Finally, if you’d like to know more about gluten’s health considerations, then check out our previous mythbusting special:

    Gluten: What’s The Truth?

    Enjoy!
