Does Eating Shellfish Contribute To Gout?

10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.

It’s Q&A Day at 10almonds!

Have a question or a request? We love to hear from you!

In cases where we’ve already covered something, we might link to what we wrote before, but will always be happy to revisit any of our topics again in the future too—there’s always more to say!

As ever: if the question/request can be answered briefly, we’ll do it here in our Q&A Thursday edition. If not, we’ll make a main feature of it shortly afterwards!

So, no question/request too big or small 😎

❝I have a question about seafood as healthy, doesn’t eating shellfish contribute to gout?❞

It can do! Gout (a kind of inflammatory arthritis characterized by the depositing of uric acid crystals in joints) has many risk factors, and diet is one component, albeit certainly the most talked-about one.

First, you may be wondering: isn’t all arthritis inflammatory? Since arthritis is by definition the inflammation of joints, this is a reasonable question. When it comes to classifying the kinds, however, “inflammatory” arthritis is caused by inflammation, while “non-inflammatory” arthritis (a slightly confusing name) merely has inflammation among its symptoms, being caused instead by physical wear-and-tear. For more information, see:

As for gout specifically, top risk factors include:

  • Age: risk increases steadily as we get older
  • Being male: women do get gout, but much less often
  • Hypertension: all-cause hypertension is the biggest reasonably controllable factor

There’s not a lot we can do about age (but of course, looking after our general health will tend to slow biological aging, and after all, diseases only care about the state of our body, not what the date on the calendar is).

As for sex, this risk factor is hormones, and specifically has to do with estrogen and testosterone’s very different effects on the immune system (bearing in mind that chronic inflammation is a disorder of the immune system). However, few if any men would take up feminizing hormone therapy just to lower their gout risk!

That leaves hypertension, which happily is something that we can all (barring extreme personal circumstances) do quite a bit about. Here’s a good starting point:

Hypertension: Factors Far More Relevant Than Salt

…and for further pointers:

How To Lower Your Blood Pressure (Cardiologists Explain)

As for diet specifically (and yes, shellfish):

The largest study into this (and thus one of the most-cited in the wider literature) looked at 47,150 men with no history of gout at baseline.

So, with the caveat that their findings could have been different for women, they found:

  • Eating meat in general increased gout risk
    • Narrowing down specific meats: beef, pork, and lamb were the worst offenders
  • Eating seafood in general increased gout risk
    • Narrowing down specific seafoods: all seafoods increased gout risk within a similar range
    • As a specific quirk of seafoods: the risk was increased if the man had a BMI under 25
  • Eating dairy in general was not associated with an increased risk of gout
    • Narrowing down specific dairy foods: low-fat dairy products such as yogurt were associated with a decreased risk of gout
  • Eating purine-rich vegetables in general was not associated with an increased risk of gout
    • Narrowing down to specific purine-rich vegetables: no purine-rich vegetable was associated with an increase in the risk of gout

Dairy products were included in the study because dairy in general, and non-fermented dairy in particular, is often associated with increased inflammation. However, no such association was found when it came to gout risk.

Purine-rich vegetables were included because the animal products highest in purines have typically been found to have the worst effect on gout. However, no such association was found for plants containing purines.

You can read the full study here:

Purine-Rich Foods, Dairy and Protein Intake, and the Risk of Gout in Men

So, the short answer to your question of “doesn’t eating shellfish contribute to the risk of gout” is:

Yes, it can, but occasional consumption probably won’t result in gout unless you have other risk factors going against you.

If you’re a slim male 80-year-old alcoholic smoker with hypertension, then definitely do consider skipping the lobster, but honestly, there may be bigger issues to tackle there.

And similarly, obviously skip it if you have a shellfish allergy, and if you’re vegan or vegetarian or abstain from shellfish for religious reasons, then you can certainly live very healthily without ever having any.

See also: Do We Need Animal Products, To Be Healthy?

For most people most of the time, a moderate consumption of seafood, including shellfish if you so desire, is considered healthy.

As ever, do speak with your own doctor to know for sure, as your individual case may vary.

For reference, this question was surely prompted by the article:

Lobster vs Crab – Which is Healthier?

Take care!

Don’t Forget…

Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!

Recommended

  • Brown Rice Protein: Strengths & Weaknesses
  • Becoming a Supple Leopard – by Dr. Kelly Starrett and Glen Cordoza
    Unlock the secrets to fitness and body mastery with “Becoming A Supple Leopard”—your ultimate guide to peak physical performance.

Learn to Age Gracefully

Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:

  • Black Pepper’s Impressive Anti-Cancer Arsenal


    Black Pepper’s Impressive Anti-Cancer Arsenal (And More)

    Piperine, a compound found in Piper nigrum (black pepper, to its friends), has many health benefits. It’s included as a minor ingredient in some other supplements, because it boosts bioavailability. In its form as a kitchen spice, it’s definitely a superfood.

    What does it do?

    First, three things that generally go together:

    These things often go together for the simple reason that oxidative stress, inflammation, and cancer often go together. In each case, it’s a matter of cellular wear-and-tear, and what can mitigate that.

    For what it’s worth, there’s generally a fourth pillar: anti-aging. This is again for the same reason. That said, black pepper hasn’t (so far as we could find) been studied specifically for its anti-aging properties, so we can’t cite that here as an evidence-based claim.

    Nevertheless, it’s a reasonable inference that something that fights oxidation, inflammation, and cancer, will often also slow aging.

    Special note on the anti-cancer properties

    We noticed two very interesting things while researching piperine’s anti-cancer properties. It’s not just that it reduces cancer risk and slows tumor growth in extant cancers (as we might expect from the above-discussed properties). Let’s spotlight some studies:

    It is selectively cytotoxic (that’s a good thing)

    Piperine was found to be selectively cytotoxic to cancerous cells, while not being cytotoxic to non-cancerous cells. To this end, it’s a very promising cancer-sniper:

    Piperine as a Potential Anti-cancer Agent: A Review on Preclinical Studies

    It can reverse multi-drug resistance in cancer cells

    P-glycoprotein, found throughout our bodies, is a drug transporter known for “washing out” chemotherapeutic drugs from cancer cells. To date, no drug has been approved to inhibit P-glycoprotein, but piperine has been found to do the job:

    Targeting P-glycoprotein: Investigation of piperine analogs for overcoming drug resistance in cancer

    What’s this about piperine analogs, though? Basically the researchers found a way to “tweak” piperine to make it even more effective. They called this tweaked version “Pip1”, because calling it by its chemical name,

    ((2E,4E)-5-(benzo[d][1,3]dioxol-5-yl)-1-(6,7-dimethoxy-3,4-dihydroisoquinolin-2(1 H)-yl)penta-2,4-dien-1-one)

    …got a bit unwieldy.

    The upshot is: Pip1 is better, but piperine itself is also good.

    Other benefits

    Piperine does have other benefits too, but the above is what we were most excited to talk about today. Its other benefits include:

    Enjoy!


  • Rehab Science – by Dr. Tom Walters 


    Many books of this kind deal with the injury but not the pain; some sources talk about pain but not the injury; this one does both, and more.

    Dr. Walters discusses in detail the nature of pain, various different kinds of pain, the factors that influence pain, and, of course, how to overcome pain.

    He also takes us on a tour of various different categories of injury, because some require very different treatment than others, and while there is some catch-all “this is good/bad for healing” advice, sometimes what will help with one injury will hinder the healing of another. So, this information alone would make the book a worthwhile read already.

    After this two-part theory-heavy introduction, the largest part of the book is given over to rehab itself, in a practical fashion.

    We learn about how to make an appropriate rehab plan, get the material things we need for it (if indeed we need material things), and specific protocols to follow for various different body parts and injuries.

    The style is very much that of a textbook, well-formatted and with plenty of illustrations throughout (color is sometimes relevant, so we recommend a print edition over Kindle for this one).

    Bottom line: if you have an injury to heal, or even just believe in being prepared, this book is an excellent guide.

    Click here to check out Rehab Science, to overcome pain and heal from injury!


  • 4 Ways Vaccine Skeptics Mislead You on Measles and More


    Measles is on the rise in the United States. In the first quarter of this year, the number of cases was about 17 times what it was, on average, during the same period in each of the four years before, according to the Centers for Disease Control and Prevention. Half of the people infected — mainly children — have been hospitalized.

    It’s going to get worse, largely because a growing number of parents are deciding not to get their children vaccinated against measles as well as diseases like polio and pertussis. Unvaccinated people, or those whose immunization status is unknown, account for 80% of the measles cases this year. Many parents have been influenced by a flood of misinformation spouted by politicians, podcast hosts, and influential figures on television and social media. These personalities repeat decades-old notions that erode confidence in the established science backing routine childhood vaccines. KFF Health News examined the rhetoric and explains why it’s misguided:

    The No-Big-Deal Trope

    A common distortion is that vaccines aren’t necessary because the diseases they prevent are not very dangerous, or too rare to be of concern. Cynics accuse public health officials and the media of fear-mongering about measles even as 19 states report cases.

    For example, an article posted on the website of the National Vaccine Information Center — a regular source of vaccine misinformation — argued that a resurgence in concern about the disease “is ‘sky is falling’ hype.” It went on to call measles, mumps, chicken pox, and influenza “politically incorrect to get.”

    Measles kills roughly 2 of every 1,000 children infected, according to the CDC. If that seems like a bearable risk, it’s worth pointing out that a far larger portion of children with measles will require hospitalization for pneumonia and other serious complications. For every 10 measles cases, one child with the disease develops an ear infection that can lead to permanent hearing loss. Another strange effect is that the measles virus can destroy a person’s existing immunity, meaning they’ll have a harder time recovering from influenza and other common ailments.

    Measles vaccines have averted the deaths of about 94 million people, mainly children, over the past 50 years, according to an April analysis led by the World Health Organization. Together with immunizations against polio and other diseases, vaccines have saved an estimated 154 million lives globally.

    Some skeptics argue that vaccine-preventable diseases are no longer a threat because they’ve become relatively rare in the U.S. (True — due to vaccination.) This reasoning led Florida’s surgeon general, Joseph Ladapo, to tell parents that they could send their unvaccinated children to school amid a measles outbreak in February. “You look at the headlines and you’d think the sky was falling,” Ladapo said on a News Nation newscast. “There’s a lot of immunity.”

    As this lax attitude persuades parents to decline vaccination, the protective group immunity will drop, and outbreaks will grow larger and faster. A rapid measles outbreak hit an undervaccinated population in Samoa in 2019, killing 83 people within four months. A chronic lack of measles vaccination in the Democratic Republic of the Congo led to more than 5,600 people dying from the disease in massive outbreaks last year.

    The ‘You Never Know’ Trope

    Since the earliest days of vaccines, a contingent of the public has considered them bad because they’re unnatural, as compared with nature’s bounty of infections and plagues. “Bad” has been redefined over the decades. In the 1800s, vaccine skeptics claimed that smallpox vaccines caused people to sprout horns and behave like beasts. More recently, they blame vaccines for ailments ranging from attention-deficit/hyperactivity disorder to autism to immune system disruption. Studies don’t back the assertions. However, skeptics argue that their claims remain valid because vaccines haven’t been adequately tested.

    In fact, vaccines are among the most studied medical interventions. Over the past century, massive studies and clinical trials have tested vaccines during their development and after their widespread use. More than 12,000 people took part in clinical trials of the most recent vaccine approved to prevent measles, mumps, and rubella. Such large numbers allow researchers to detect rare risks, which are a major concern because vaccines are given to millions of healthy people.

    To assess long-term risks, researchers sift through reams of data for signals of harm. For example, a Danish group analyzed a database of more than 657,000 children and found that those who had been vaccinated against measles as babies were no more likely to later be diagnosed with autism than those who were not vaccinated. In another study, researchers analyzed records from 805,000 children born from 1990 through 2001 and found no evidence to back a concern that multiple vaccinations might impair children’s immune systems.

    Nonetheless, people who push vaccine misinformation, like candidate Robert F. Kennedy Jr., dismiss massive, scientifically vetted studies. For example, Kennedy argues that clinical trials of new vaccines are unreliable because vaccinated kids aren’t compared with a placebo group that gets saline solution or another substance with no effect. Instead, many modern trials compare updated vaccines with older ones. That’s because it’s unethical to endanger children by giving them a sham vaccine when the protective effect of immunization is known. In a 1950s clinical trial of polio vaccines, 16 children in the placebo group died of polio and 34 were paralyzed, said Paul Offit, director of the Vaccine Education Center at Children’s Hospital of Philadelphia and author of a book on the first polio vaccine.

    The Too-Much-Too-Soon Trope

    Several bestselling vaccine books on Amazon promote the risky idea that parents should skip or delay their children’s vaccines. “All vaccines on the CDC’s schedule may not be right for all children at all times,” writes Paul Thomas in his bestselling book “The Vaccine-Friendly Plan.” He backs up this conviction by saying that children who have followed “my protocol are among the healthiest in the world.”

    Since the book was published, Thomas’ medical license was temporarily suspended in Oregon and Washington. The Oregon Medical Board documented how Thomas persuaded parents to skip vaccines recommended by the CDC, and reported that he “reduced to tears” a mother who disagreed. Several children in his care came down with pertussis and rotavirus, diseases easily prevented by vaccines, wrote the board. Thomas recommended fish oil supplements and homeopathy to an unvaccinated child with a deep scalp laceration, rather than an emergency tetanus vaccine. The boy developed severe tetanus, landing in the hospital for nearly two months, where he required intubation, a tracheotomy, and a feeding tube to survive.

    The vaccination schedule recommended by the CDC has been tailored to protect children at their most vulnerable points in life and minimize side effects. The combination measles, mumps, and rubella vaccine isn’t given for the first year of a baby’s life because antibodies temporarily passed on from their mother can interfere with the immune response. And because some babies don’t generate a strong response to that first dose, the CDC recommends a second one around the time a child enters kindergarten because measles and other viruses spread rapidly in group settings.

    Delaying MMR doses much longer may be unwise because data suggests that children vaccinated at 10 or older have a higher chance of adverse reactions, such as a seizure or fatigue.

    Around a dozen other vaccines have discrete timelines, with overlapping windows for the best response. Studies have shown that MMR vaccines may be given safely and effectively in combination with other vaccines.

    The ‘They Don’t Want You to Know’ Trope

    Kennedy compares the Florida surgeon general to Galileo in the introduction to Ladapo’s new book on transcending fear in public health. Just as the Roman Catholic inquisition punished the renowned astronomer for promoting theories about the universe, Kennedy suggests that scientific institutions oppress dissenting voices on vaccines for nefarious reasons.

    “The persecution of scientists and doctors who dare to challenge contemporary orthodoxies is not a new phenomenon,” Kennedy writes. His running mate, lawyer Nicole Shanahan, has campaigned on the idea that conversations about vaccine harms are censored and the CDC and other federal agencies hide data due to corporate influence.

    Claims like “they don’t want you to know” aren’t new among the anti-vaccine set, even though the movement has long had an outsize voice. The most listened-to podcast in the U.S., “The Joe Rogan Experience,” regularly features guests who cast doubt on scientific consensus. Last year on the show, Kennedy repeated the debunked claim that vaccines cause autism.

    Far from ignoring that concern, epidemiologists have taken it seriously. They have conducted more than a dozen studies searching for a link between vaccines and autism, and repeatedly found none. “We have conclusively disproven the theory that vaccines are connected to autism,” said Gideon Meyerowitz-Katz, an epidemiologist at the University of Wollongong in Australia. “So, the public health establishment tends to shut those conversations down quickly.”

    Federal agencies are transparent about seizures, arm pain, and other reactions that vaccines can cause. And the government has a program to compensate individuals whose injuries are scientifically determined to result from them. Around 1 to 3.5 out of every million doses of the measles, mumps, and rubella vaccine can cause a life-threatening allergic reaction; a person’s lifetime risk of death by lightning is estimated to be as much as four times as high.

    “The most convincing thing I can say is that my daughter has all her vaccines and that every pediatrician and public health person I know has vaccinated their kids,” Meyerowitz-Katz said. “No one would do that if they thought there were serious risks.”

    KFF Health News is a national newsroom that produces in-depth journalism about health issues and is one of the core operating programs at KFF—an independent source of health policy research, polling, and journalism. Learn more about KFF.

    Subscribe to KFF Health News’ free Morning Briefing.


Related Posts

  • Brown Rice Protein: Strengths & Weaknesses
  • Hope: A research-based explainer


    This year, more than 60 countries, representing more than 4 billion people, will hold major elections. News headlines already are reporting that voters are hanging on to hope. When things get tough or don’t go our way, we’re told to hang on to hope. HOPE was the only word printed on President Barack Obama’s iconic campaign poster in 2008.

    Research on hope has flourished only in recent decades. There’s now a growing recognition that hope has a role in physical, social, and mental health outcomes, including promoting resilience. As we embark on a challenging year of news, it’s important for journalists to learn about hope.

    So what is hope? And what does the research say about it?

    Merriam-Webster defines hope as a “desire accompanied by expectation of or belief in fulfillment.” This definition highlights the two basic dimensions of hope: a desire and a belief in the possibility of attaining that desire.

    Hope is not Pollyannaish optimism, writes psychologist Everett Worthington in a 2020 article for The Conversation. “Instead, hope is a motivation to persevere toward a goal or end state, even if we’re skeptical that a positive outcome is likely.”

    There are several scientific theories about hope.

    One of the first, and most well-known, theories on hope was introduced in 1991 by American psychologist Charles R. Snyder.

    In a paper published in the Journal of Personality and Social Psychology, Snyder defined hope as a cognitive trait centered on the pursuit of goals and built on two components: a sense of agency in achieving a goal, and a perceived ability to create pathways to achieve that goal. He defined hope as something individualistic.

    Snyder also introduced the Hope Scale, which continues to be used today as a way to measure hope. He suggested that some people have higher levels of hope than others, and that there seem to be benefits to being more hopeful.

    “For example, we would expect that higher as compared with lower hope people are more likely to have a healthy lifestyle, to avoid life crises, and to cope better with stressors when they are encountered,” the authors write.

    Others have suggested broader definitions.

    In 1992, Kaye Herth, a professor of nursing and a scholar on hope, defined hope as “a multidimensional dynamic life force characterized by a confident yet uncertain expectation of achieving good, which to the hoping person, is realistically possible and personally significant.” Herth also developed the Herth Hope Index, which is used in various settings, including clinical practice and research.

    More recently, others have offered an even broader definition of hope.

    Anthony Scioli, a clinical psychologist and author of several books on hope, defines hope “as an emotion with spiritual dimensions,” in a 2023 review published in Current Opinion in Psychology. “Hope is best viewed as an ameliorating emotion, designed to fill the liminal space between need and reality.”

    Hope is also nuanced.

    “Our hopes may be active or passive, patient or critical, private or collective, grounded in the evidence or resolute in spite of it, socially conservative or socially transformative,” writes Darren Webb in a 2007 study published in History of the Human Sciences. “We all hope, but we experience this most human of all mental feelings in a variety of modes.”

    To be sure, a few studies have shown that hope can have negative outcomes in certain populations and situations. For example, one study highlighted in the research roundup below finds that Black college students who had higher levels of hope experienced more stress due to racial discrimination compared with Black students who had lower levels of hope.

    Today, hope is one of the most well-studied constructs within the field of positive psychology, according to the journal Current Opinion in Psychology, which dedicated its August 2023 issue to the subject. (Positive psychology is a branch of psychology focused on characters and behaviors that allow people to flourish.)

    We’ve gathered several studies below to help you think more deeply about hope and recognize its role in your everyday lives.

    Research roundup

    The Role of Hope in Subsequent Health and Well-Being For Older Adults: An Outcome-Wide Longitudinal Approach
    Katelyn N.G. Long, et al. Global Epidemiology, November 2020.

    The study: To explore the potential public health implications of hope, researchers examine the relationship between hope and physical, behavioral and psychosocial outcomes in 12,998 older adults in the U.S. with a mean age of 66.

    Researchers note that most investigations on hope have focused on psychological and social well-being outcomes and less attention has been paid to its impact on physical and behavioral health, particularly among older adults.

    The findings: Results show a positive association between an increased sense of hope and a variety of behavioral and psychosocial outcomes, such as fewer sleep problems, more physical activity, optimism and satisfaction with life. However, there wasn’t a clear association between hope and all physical health outcomes. For instance, hope was associated with a reduced number of chronic conditions, but not with stroke, diabetes and hypertension.

    The takeaway: “The later stages of life are often defined by loss: the loss of health, loved ones, social support networks, independence, and (eventually) loss of life itself,” the authors write. “Our results suggest that standard public health promotion activities, which often focus solely on physical health, might be expanded to include a wider range of factors that may lead to gains in hope. For example, alongside community-based health and nutrition programs aimed at reducing chronic conditions like hypertension, programs that help strengthen marital relations (e.g., closeness with a spouse), provide opportunities to volunteer, help lower anxiety, or increase connection with friends may potentially increase levels of hope, which in turn, may improve levels of health and well-being in a variety of domains.”

    Associated Factors of Hope in Cancer Patients During Treatment: A Systematic Literature Review
    Corine Nierop-van Baalen, Maria Grypdonck, Ann van Hecke and Sofie Verhaeghe. Journal of Advanced Nursing, March 2020.

    The study: The authors review 33 studies, written in English or Dutch and published in the past decade, on the relationship between hope and the quality of life and well-being of patients with cancer. Studies have shown that many cancer patients respond to their diagnosis by nurturing hope, while many health professionals feel uneasy when patients’ hopes go far beyond their prognosis, the authors write.

    The findings: Quality of life, social support, and spiritual well-being were positively associated with hope, as measured with various scales, whereas symptoms, psychological distress, and depression were negatively associated with hope. Hope didn’t seem to be affected by the type or stage of cancer or the patient’s demographics.

    The takeaway: “Hope seems to be a process that is determined by a person’s inner being rather than influenced from the outside,” the authors write. “These factors are typically given meaning by the patients themselves. Social support, for example, is not about how many patients experience support, but that this support has real meaning for them.”

    Characterizing Hope: An Interdisciplinary Overview of the Characteristics of Hope
    Emma Pleeging, Job van Exel and Martijn Burger. Applied Research in Quality of Life, September 2021.

    The study: This systematic review provides an overview of the concept of hope based on 66 academic papers in ten academic fields, including economics and business studies, environmental studies, health studies, history and humanities, philosophy, political science, psychology, social science, theology, and youth studies, resulting in seven themes and 41 sub-themes.

    The findings: The authors boil down their findings to seven components: internal and external sources, the individual and social experience of hope, internal and external effects, and the object of hope, which can be “just about anything we can imagine,” the authors write.

    The takeaway: “An important implication of these results lies in the way hope is measured in applied and scientific research,” researchers write. “When measuring hope or developing instruments to measure it, researchers could be well-advised to take note of the broader understanding of the topic, to prevent that important characteristics might be overlooked.”

    Revisiting the Paradox of Hope: The Role of Discrimination Among First-Year Black College Students
    Ryon C. McDermott, et al. Journal of Counseling Psychology, March 2020.

    The study: Researchers examine the moderating effects of hope on the association between experiencing racial discrimination, stress and academic well-being among 203 first-year U.S. Black college students. They build on a small body of evidence that suggests high levels of hope might have a negative effect on Black college students who experience racial discrimination.

    The authors use data gathered as part of an annual paper-and-pencil survey of first-year college students at a university on the Gulf Coast, which the study doesn’t identify.

    The findings: Researchers find that Black students who had higher levels of hope experienced more stress due to racial discrimination compared with students who had lower levels of hope. On the other hand, Black students with low levels of hope may be less likely to experience stress when they encounter discrimination.

    Meanwhile, Black students who had high levels of hope were more successful in academic integration — which researchers define as satisfaction with and integration into the academic aspects of college life — despite facing discrimination. But low levels of hope had a negative impact on students’ academic well-being.

    “The present study found evidence that a core construct in positive psychology, hope, may not always protect Black students from experiencing the psychological sting of discrimination, but it was still beneficial to their academic well-being,” the authors write.

    The takeaway: “Our findings also highlight an urgent need to reduce discrimination on college campuses,” the researchers write. “Reducing discrimination could help Black students (and other racial minorities) avoid additional stress, as well as help them realize the full psychological and academic benefits of having high levels of hope.”

    Additional reading

    Hope Across Cultural Groups. Lisa M. Edwards and Kat McConnell. Current Opinion in Psychology, February 2023.

    The Psychology of Hope: A Diagnostic and Prescriptive Account. Anthony Scioli. In “Historical and Multidisciplinary Perspectives on Hope,” July 2020.

    Hope Theory: Rainbows in the Mind. C.R. Snyder. Psychological Inquiry, 2002.

    This article first appeared on The Journalist’s Resource and is republished here under a Creative Commons license.

    Don’t Forget…

    Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!

    Learn to Age Gracefully

    Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:

  • Beyond Supplements: The Real Immune-Boosters!

    The Real Immune-Boosters

    What comes to your mind when we say “immune support”? Vitamin C and maybe zinc? Those have their place, but there are things we can do that are a lot more important!

    It’s just that these things aren’t talked about as much, because stores can’t sell them to you.

    Sleep

    One of the biggest difference-makers: get good sleep! Getting at least 7 hours of decent sleep per night (actual sleep, so not time spent lying awake in bed, and not counting interruptions toward the total) makes a three-to-fourfold difference to your immune defenses.

    Put another way, people are 3–4 times more likely to get sick if they get less sleep than that on average.

    Check it out: Behaviorally Assessed Sleep and Susceptibility to the Common Cold

    Eat an anti-inflammatory diet

    In short, for most of us this means lots of whole plant foods (lots of fiber), and limited sugar, refined flour, and alcohol.

    For more details, you can see our main feature on this: Keep Inflammation At Bay!

    You may wonder why eating to reduce inflammation (inflammation is a form of immune response) will help improve immune response. Put it this way:

    If your town’s fire service is called out eleventy-two times per day to deal with things that are not, in fact, fires, then when there is a fire, they will be already exhausted, and will not do their job so well.

    Look after your gut microbiota

    Additionally, a healthy gut microbiota (fostered by the same diet we just described) helps keep your body pathogen-free by preventing “leaky gut syndrome”. This occurs when, for example, C. albicans (which you do not want in your gut, and which thrives on the things we just told you to limit) puts its roots through your intestinal walls, making holes in them. And through those holes? Bacteria from your intestines can pass into the rest of your body, where you definitely do not want them.

    See also: Gut Health 101

    Actually get that moderate exercise

    There’s definitely a sweet-spot here, because too much exercise will also exhaust you and deplete your body’s resources. However, the famous “150 minutes per week” (so, a little over 20 minutes per day, or 25 minutes per day with one day off) will make a big difference.

    See: Exercise and the Regulation of Immune Functions

    Manage your stress levels (good and bad!)

    This one swings both ways:

    • Acute stress (like a cold shower) is good for immune response. Think of it like a fire drill for your body.
    • Chronic stress (“the general everything” persistently stressful in life) is bad for immune response. This is the fire drill that never ends. Your body’s going to know what to do really well, but it’s going to be exhausted already by the time an actual threat hits.

    Read more: Effects of Stress on Immune Function: the Good, the Bad, and the Beautiful

    Supplement, yes.

    These are far less critical than the above things, but can also be helpful; the classics mentioned at the top, such as vitamin C and zinc, do have their place.

    Enjoy, and stay well!


  • Is white rice bad for me? Can I make it lower GI or healthier?

    Rice is a culinary staple in Australia and around the world.

    It might seem like a given that brown rice is healthier than white, and official public health resources often recommend brown rice instead of white as a “healthy swap”.

    But Australians definitely prefer white rice over brown. So, what’s the difference, and what do we need to know when choosing rice?

    What makes rice white or brown?

    Rice “grains” are technically seeds. A complete, whole rice seed is called a “paddy”, which has multiple parts:

    1. the “hull” is the hard outer layer which protects the seed
    2. the “bran”, which is a softer protective layer containing the seed coat
    3. the “germ” or the embryo, which is the part of the seed that would develop into a new plant if it were germinated
    4. the “endosperm”, which makes up most of the seed and is essentially the store of nutrients that feeds the embryo as the seed grows into a plant.

    Rice needs to be processed for humans to eat it.

    Along with cleaning and drying, the hard hulls are removed since we can’t digest them. This is how brown rice is made, with the other three parts of the rice remaining intact. This means brown rice is regarded as a “wholegrain”.

    White rice, however, is a “refined” grain, as it is further polished to remove the bran and germ, leaving just the endosperm. This is a mechanical and not a chemical process.

    What’s the difference, nutritionally?

    Keeping the bran and the germ means brown rice has more magnesium, phosphorus, potassium, B vitamins (niacin, folate, riboflavin and pyridoxine), iron, zinc and fibre.

    The germ and the bran also contain more bioactives (compounds in foods that aren’t essential nutrients but have health benefits), like oryzanols and phenolic compounds which have antioxidant effects.

    But that doesn’t mean white rice is just empty calories. It still contains vitamins, minerals and some fibre, and is low in fat and salt, and is naturally gluten-free.

    White and brown rice actually have similar amounts of calories (or kilojoules) and total carbohydrates.

    There are studies that show eating more white rice is linked to a higher risk of type 2 diabetes. But it is difficult to know if this is down to the rice itself, or other related factors such as socioeconomic variables or other dietary patterns.

    What about the glycaemic index?

    The higher fibre means brown rice has a lower glycaemic index (GI), meaning it raises blood sugar levels more slowly. But this is highly variable between different rices within the white and brown categories.

    The GI system uses low (less than 55), medium (55–70) and high (above 70) categories. Brown rices fall into the low and medium categories, while white rices fall into the medium and high categories.

    There are specific low-GI types available for both white and brown types. You can also lower the GI of rice by heating and then cooling it. This process converts some of the “available carbohydrates” into “resistant starch”, which then functions like dietary fibre.

    Are there any benefits to white rice?

    The taste and textural qualities of white and brown rices differ. White rice tends to have a softer texture and more mild or neutral flavour. Brown rice has a chewier texture and nuttier flavour.

    So, while you can technically substitute brown rice into most recipes, the experience will be different, and other ingredients may need to be added or changed to create the desired texture.

    Removing more of the outer layers may also reduce the levels of contaminants such as pesticides.

    We don’t just eat rice

    Comparing white and brown rice seems like an easy way to boost nutritional value. But just because one food (brown rice) is more nutrient-dense doesn’t make the other food (white rice) “bad”.

    Ultimately, it’s not often that we eat just rice, so we don’t need the rice we choose to be the perfect one. Rice is typically the staple base of a more complex dish. So, it’s probably more important to think about what we eat with rice.

    Adding vegetables and lean proteins to rice-based dishes can easily add the micronutrients, bioactives and fibre that white rice is comparatively lacking, and this can likely do more to contribute to diet quality than eating brown rice instead.

    Emma Beckett, Adjunct Senior Lecturer, Nutrition, Dietetics & Food Innovation – School of Health Sciences, UNSW Sydney

    This article is republished from The Conversation under a Creative Commons license. Read the original article.
