MSG vs. Salt: Sodium Comparison
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
It’s Q&A Day at 10almonds!
Q: Is MSG healthier than salt in terms of sodium content or is it the same or worse?
Great question, and for that matter, MSG itself is a great topic for another day. But your actual question, we can readily answer here and now:
- Firstly, by “salt” we’re assuming from context that you mean sodium chloride.
- Both salt and MSG do contain sodium. However…
- MSG contains only about a third of the sodium that salt does, gram-for-gram.
- It’s still wise to be mindful of it, though. Same with sodium in other ingredients!
- Baking soda contains about twice as much sodium, gram for gram, as MSG.
Wondering why this happens?
Salt (sodium chloride, NaCl) is equal parts sodium and chlorine, by atom count, but sodium’s atomic mass is lower than chlorine’s, so 100g of salt contains only 39.34g of sodium.
Baking soda (sodium bicarbonate, NaHCO₃) is one part sodium to one part hydrogen, one part carbon, and three parts oxygen. Taking their different atomic masses into account, we see that 100g of baking soda contains 27.4g of sodium.
MSG (monosodium glutamate, C₅H₈NO₄Na) is only one part sodium to five parts carbon, eight parts hydrogen, one part nitrogen, and four parts oxygen… and all those other atoms put together weigh a lot (comparatively), so 100g of MSG contains only 12.28g of sodium.
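For the curious, here’s the back-of-the-envelope arithmetic (our own check, using standard atomic masses: Na ≈ 22.99, H ≈ 1.008, C ≈ 12.011, N ≈ 14.007, O ≈ 15.999, Cl ≈ 35.45). The sodium content is simply sodium’s share of each compound’s molar mass. Note that the 12.28g figure above corresponds to the monohydrate form (C₅H₈NO₄Na·H₂O) in which MSG is commonly sold; the anhydrous form would work out to about 13.6g per 100g:

$$
\begin{aligned}
\text{NaCl}:&\quad \frac{22.99}{22.99+35.45} = \frac{22.99}{58.44} \approx 39.3\%\\
\text{NaHCO}_3:&\quad \frac{22.99}{22.99+1.008+12.011+3(15.999)} = \frac{22.99}{84.01} \approx 27.4\%\\
\text{C}_5\text{H}_8\text{NO}_4\text{Na}\cdot\text{H}_2\text{O}:&\quad \frac{22.99}{169.11+18.02} = \frac{22.99}{187.13} \approx 12.3\%
\end{aligned}
$$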
Eat To Beat Cancer
Controlling What We Can, To Avoid Cancer
Every time a cell in our body is replaced, there’s a chance it will be cancerous. Exactly what that chance is depends on very many factors. Some of them we can’t control; others, we can.
Diet is a critical, modifiable factor
We can’t choose, for example, our genes. We can, for the most part, choose our diet. Why “for the most part”?
- Some people live in a food desert (the Arctic Circle is a good example of a region where food choices are limited by supply)
- Some people have dietary restrictions (whether due to a health condition, e.g., allergy or intolerance, or a personal-but-unwavering choice, e.g., vegetarian, vegan, kosher, or halal)
But for most of us, most of the time, we have good control over our diet, so that’s an area we can and should focus on.
Choose your animal protein wisely
If you are vegan, you can skip this section. If you are not, then the short version is:
- Fish: almost certainly fine
- Poultry: the jury is out; data is leaning towards fine, though
- Red meat: significantly increased cancer risk
- Processed meat: significantly increased cancer risk
For more details (and a run-down on the science behind the above super-summarized version):
- Do We Need Animal Products To Be Healthy? ← A mythbuster article that outlines many health properties (good and bad) of animal products
- The Whys and Hows of Cutting Meats Out Of Your Diet ← A life-hack article about acting on that information
Skip The Ultra-Processed Foods
Ok, so this one’s probably not a shocker in its simplest form:
❝Studies are showing us that not only do ultraprocessed foods increase the risk of cancer, but that after a cancer diagnosis such foods increase the risk of dying❞
Source: Is there a connection between ultraprocessed food and cancer?
There’s an unfortunate implication here! If you took the previous advice to heart and cut out [at least some] meat, and/but then replaced that with ultra-processed synthetic meat, then this was not a great improvement in cancer risk terms.
Ultra-processed meat is worse than unprocessed, regardless of whether it was from an animal or was synthetic.
In other words: if you buy textured soy pieces (a common synthetic meat), it pays to look at the ingredients, because there’s a difference between:
- INGREDIENTS: SOY
- INGREDIENTS: Rehydrated Textured SOY Protein (52%), Water, Rapeseed Oil, SOY Protein Concentrate, Seasoning (SULPHITES) (Dextrose, Flavourings, Salt, Onion Powder, Food Starch Modified, Yeast Extract, Colour: Red Iron Oxide), SOY Leghemoglobin, Fortified WHEAT Flour (WHEAT Flour, Calcium Carbonate, Iron, Niacin, Thiamin), Bamboo Fibre, Methylcellulose, Tomato Purée, Salt, Raising Agent: Ammonium Carbonates
Now, most of those original base ingredients are/were harmless per se (as are/were the grapes in wine—before processing into alcohol), but it has clearly been processed to Hell and back to do all that.
Choose the one that just says “soy”. Or eat soybeans. Or other beans. Or lentils. Really there are a lot of options.
About soy, by the way…
There is (mostly in the US, mostly funded by the animal agriculture industry) a lot of fearmongering about soy. Which is ironic, given the amount of soy that is fed to livestock to be fed to humans, but it does bear addressing:
❝Soy foods are safe for all cancer patients and are an excellent source of plant protein. Studies show soy may improve survival after breast cancer❞
Source: Food risks and cancer: What to avoid
(obviously, if you have a soy allergy then you should not consume soy—for most people, the above advice stands, though)
Advanced Glycation End-Products
These (which are Very Bad™ for very many things, including cancer) occur specifically as a result of processing animal proteins and fats.
Note: not even necessarily ultra-processing; just processing can do it. But ultra-processing is worse. What’s the difference between “ultra-processed” and just “processed”? Consider:
- Your average hotdog has been ultra-processed. Not only has it usually been changed with many artificial additives, it’s also been through a series of processes (physical and chemical), and ends up bearing little relation to the creature it came from.
- Your bacon (that you bought fresh from your local butcher, not a supermarket brand of unknown provenance, and definitely not the kind that might come on the top of frozen supermarket pizza) has been processed. It’s undergone a couple of simple processes on its journey “from farm to table”. Remember also that when you cook it, that too is one more process (and one that results in a lot of AGEs).
Read more: What’s so bad about AGEs?
Note: if you really don’t want to cut out certain foods, changing the way you cook them (i.e., the last process your food undergoes before you eat it) can also reduce AGEs:
Advanced Glycation End Products in Foods and a Practical Guide to Their Reduction in the Diet
Get More Fiber
❝The American Institute for Cancer Research shows that for every 10-gram increase in fiber in the diet, you improve survival after cancer diagnosis by 13%❞
Source: Plant-based diet is encouraged for patients with cancer
Yes, that’s post-diagnosis, but as a general rule of thumb, what is good/bad for cancer when you have it is good/bad for cancer beforehand, too.
If you’re thinking that increasing your fiber intake means having to add bran to everything, happily there are better ways:
Level-Up Your Fiber Intake! (Without Difficulty Or Discomfort)
Enjoy!
We don’t all need regular skin cancer screening – but you can know your risk and check yourself
Australia has one of the highest skin cancer rates globally, with nearly 19,000 Australians diagnosed with invasive melanoma – the most lethal type of skin cancer – each year.
While advanced melanoma can be fatal, it is highly treatable when detected early.
But Australian clinical practice guidelines and health authorities do not recommend screening for melanoma in the general population.
Given our reputation as the skin cancer capital of the world, why isn’t there a national screening program? Australia currently screens for breast, cervical and bowel cancer and will begin lung cancer screening in 2025.
It turns out the question of whether to screen everyone for melanoma and other skin cancers is complex. Here’s why.
The current approach
On top of the 19,000 invasive melanoma diagnoses each year, around 28,000 people are diagnosed with in-situ melanoma.
In-situ melanoma refers to a very early stage melanoma where the cancerous cells are confined to the outer layer of the skin (the epidermis).
Instead of a blanket screening program, Australia promotes skin protection, skin awareness and regular skin checks (at least annually) for those at high risk.
About one in three Australian adults have had a clinical skin check within the past year.
Those with fairer skin or a family history may be at greater risk of skin cancer.
Why not just do skin checks for everyone?
The goal of screening is to find disease early, before symptoms appear, which helps save lives and reduce morbidity.
But there are a couple of reasons a national screening program is not yet in place.
We need to ask:
1. Does it save lives?
Many researchers would argue this is the goal of universal screening. But while universal skin cancer screening would likely lead to more melanoma diagnoses, this might not necessarily save lives. It could result in indolent (slow-growing) cancers being diagnosed that might have never caused harm. This is known as “overdiagnosis”.
Screening will pick up some cancers people could have safely lived with, if they didn’t know about them. The difficulty is in recognising which cancers are slow-growing and can be safely left alone.
Receiving a diagnosis causes stress and is more likely to lead to additional medical procedures (such as surgeries), which carry their own risks.
2. Is it value for money?
Implementing a nationwide screening program involves significant investment and resources. Its value to the health system would need to be calculated, to ensure this is the best use of resources.
Narrower targets for better results
Instead of screening everyone, targeting high-risk groups has shown better results. This focuses efforts where they’re needed most. Risk factors for skin cancer include fair skin, red hair, a history of sunburns, many moles and/or a family history.
Research has shown the public would be mostly accepting of a risk-tailored approach to screening for melanoma.
There are moves underway to establish a national targeted skin cancer screening program in Australia, with the government recently pledging $10.3 million to help tackle “the most common cancer in our sunburnt country, skin cancer” by focusing on those at greater risk.
Currently, Australian clinical practice guidelines recommend doctors properly evaluate all patients for their future risk of melanoma.
Looking with new technological eyes
Technological advances are improving the accuracy of skin cancer diagnosis and risk assessment.
For example, researchers are investigating 3D total body skin imaging to monitor changes to spots and moles over time.
Artificial intelligence (AI) algorithms can analyse images of skin lesions, and support doctors’ decision making.
Genetic testing can now identify risk markers for more personalised screening.
And telehealth has made remote consultations possible, increasing access to specialists, particularly in rural areas.
Check yourself – 4 things to look for
Skin cancer can affect all skin types, so it’s a good idea to become familiar with your own skin. The Skin Cancer College Australasia has introduced a guide called SCAN your skin, which tells people to look for skin spots or areas that are:
1. sore (scaly, itchy, bleeding, tender) and don’t heal within six weeks
2. changing in size, shape, colour or texture
3. abnormal for you and look different or feel different, or stand out when compared to your other spots and moles
4. new and have appeared on your skin recently. Any new moles or spots should be checked, especially if you are over 40.
If something seems different, make an appointment with your doctor.
You can self-assess your melanoma risk online via the Melanoma Institute Australia or QIMR Berghofer Medical Research Institute.
H. Peter Soyer, Professor of Dermatology, The University of Queensland; Anne Cust, Professor of Cancer Epidemiology, The Daffodil Centre and Melanoma Institute Australia, University of Sydney; Caitlin Horsham, Research Manager, The University of Queensland, and Monika Janda, Professor in Behavioural Science, The University of Queensland
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Science of Nutrition – by Rhiannon Lambert
While there are a lot of conflicting dietary approaches out there, the science itself is actually fairly cohesive in most regards. This book does a lot of what we do here at 10almonds, and presents the science in a clear fashion without having any particular agenda to push.
The author is a nutritionist (BSc, MSc, RNutr) and therefore provides an up-to-date evidence-based approach for eating.
As a result, the only part of this book that brings it down, in this reviewer’s opinion, is the section on intermittent fasting. Since that topic is not strictly about nutrition, she has less expertise there, and it shows.
The information is largely presented in double-page spreads each answering a particular question. Because of this, and the fact there are colorful graphic representations of information too, we do recommend the print version over Kindle*.
Bottom line: if you like the notion of real science being presented in a clear and simple fashion (we like to think our subscribers do!), then you’ll surely enjoy this book.
Click here to check out the Science of Nutrition, and get a clear overview!
*Writer’s note: I realize I’ve now recommended the print version two days in a row (yesterday because there are checkboxes to check, worksheets to complete, etc.), but it’s not a new trend; it’s just how it happened to be with these two books. I love my Kindle dearly, but sometimes print has the edge for one reason or another!
Olive oil is healthy. Turns out olive leaf extract may be good for us too
Olive oil is synonymous with the Mediterranean diet, and the health benefits of both are well documented.
Olive oil reduces the risk of heart disease, cancer, diabetes and premature death. Olives also contain numerous healthy nutrients.
Now evidence is mounting about the health benefits of olive leaves, including from studies in a recent review.
Here’s what’s in olive leaves and who might benefit from taking olive leaf extract.
What’s in olive leaves?
Olive leaves have traditionally been brewed as a tea in the Mediterranean and drunk to treat fever and malaria.
The leaves contain high levels of a type of antioxidant called oleuropein. Olives and olive oil contain this too, but at lower levels.
Generally, the greener the leaf (the less yellowish) the more oleuropein it contains. Leaves picked in spring also have higher levels compared to ones picked in autumn, indicating levels of oleuropein reduce as the leaves get older.
Olive leaves also contain other antioxidants such as hydroxytyrosol, luteolin, apigenin and verbascoside.
Antioxidants work by reducing the oxidative stress in the body. Oxidative stress causes damage to our DNA, cell membranes and tissues, which can lead to chronic diseases such as cancer and heart disease.
Are olive leaves healthy?
One review and analysis combined data from 12 experimental studies with 819 participants in total. Overall, olive leaf extract improved risk factors for heart disease. This included healthier blood lipids (fats) and lowering blood pressure.
The effect was greater for people who already had high blood pressure.
Most studies in this review gave olive leaf extract as a capsule, with daily doses of 500 milligrams to 5 grams for six to 48 weeks.
Another review and analysis published late last year looked at data from 12 experimental studies, with a total of 703 people. Some of these studies involved people with high blood lipids, people with high blood pressure, people who were overweight or obese, and some involved healthy people.
Daily doses were 250-1,000mg taken as tablets or baked into bread.
Individual studies in the review showed significant benefits in improving blood glucose (sugar) control, blood lipid levels and reducing blood pressure. But when all the data was combined, there were no significant health effects. We’ll explain why this may be the case shortly.
Another review looked at people who took oleuropein and hydroxytyrosol (the antioxidants in olive leaves). This found significant improvements in body weight, blood lipid profiles and glucose metabolism, as well as in bone, joint and cognitive function.
The individual studies tested either the two antioxidants themselves, or olive leaf incorporated into foods such as bread and cooking oils (but not olive oil). The doses were 6-500mg per day of olive leaf extract.
So what can we make of these studies overall? They show olive leaf extract may help reduce blood pressure, improve blood lipids and help our bodies handle glucose.
But these studies show inconsistent results. This is likely due to differences in the way people took olive leaf extract, how much they took and how long for. This type of inconsistency normally tells us we need some more research to clarify the health effects of olive leaves.
Can you eat olive leaves?
Olive leaves can be brewed into a tea, or the leaves added to salads. Others report grinding olive leaves into smoothies.
However, the leaves are bitter because of the antioxidants, which can make them hard to eat, or the tea unpalatable.
Olive leaf extract has also been added to bread and other baked goods. Researchers find this improves the level of antioxidants in these products, and people say the foods taste better.
Is olive leaf extract toxic?
No, there seem to be no reported toxic effects of eating or drinking olive leaf extract.
It appears safe up to 1g a day, according to studies that have used olive leaf extract. However, there are no official guidelines about how much is safe to consume.
There have been reports of potential toxicity if taken over 85mg/kg of body weight per day. For an 80kg adult, this would mean 6.8g a day, well above the dose used in the studies mentioned in this article.
Pregnant and breastfeeding women are recommended not to consume it as we don’t know if it’s safe for them.
What should I do?
If you have high blood pressure, diabetes or raised blood lipids you may see some benefit from taking olive leaf extract. But it is important you discuss this with your doctor first and not change any medications or start taking olive leaf extract until you have spoken to them.
But there are plenty of antioxidants in all plant foods, and you should try to eat a wide variety of different coloured plant foods. This will allow you to get a range of nutrients and antioxidants.
Olive leaf and its extract are not going to be a panacea for your health if you’re not eating a healthy diet and following other health advice.
Evangeline Mantzioris, Program Director of Nutrition and Food Sciences, Accredited Practising Dietitian, University of South Australia
This article is republished from The Conversation under a Creative Commons license. Read the original article.
War in Ukraine affected wellbeing worldwide, but people’s speed of recovery depended on their personality
The war in Ukraine has had impacts around the world. Supply chains have been disrupted, the cost of living has soared and we’ve seen the fastest-growing refugee crisis since World War II. All of these are in addition to the devastating humanitarian and economic impacts within Ukraine.
Our international team was conducting a global study on wellbeing in the lead up to and after the Russian invasion. This provided a unique opportunity to examine the psychological impact of the outbreak of war.
As we explain in a new study published in Nature Communications, we learned the toll on people’s wellbeing was evident across nations, not just in Ukraine. These effects appear to have been temporary – at least for the average person.
But people with certain psychological vulnerabilities struggled to recover from the shock of the war.
Tracking wellbeing during the outbreak of war
People who took part in our study completed a rigorous “experience-sampling” protocol. Specifically, we asked them to report their momentary wellbeing four times per day for a whole month.
Data collection began in October 2021 and continued throughout 2022. So we had been tracking wellbeing around the world during the weeks surrounding the outbreak of war in February 2022.
We also collected measures of personality, along with various sociodemographic variables (including age, gender, political views). This enabled us to assess whether different people responded differently to the crisis. We could also compare these effects across countries.
Our analyses focused primarily on 1,341 participants living in 17 European countries, excluding Ukraine itself (44,894 experience-sampling reports in total). We also expanded these analyses to capture the experiences of 1,735 people living in 43 countries around the world (54,851 experience-sampling reports) – including in Australia.
A global dip in wellbeing
On February 24 2022, the day Russia invaded Ukraine, there was a sharp decline in wellbeing around the world. There was no decline in the month leading up to the outbreak of war, suggesting the change in wellbeing was not already occurring for some other reason.
However, there was a gradual increase in wellbeing during the month after the Russian invasion, suggestive of a “return to baseline” effect. Such effects are commonly reported in psychological research: situations and events that impact our wellbeing often (though not always) do so temporarily.
Unsurprisingly, people in Europe experienced a sharper dip in wellbeing compared to people living elsewhere around the world. Presumably the war was much more salient for those closest to the conflict, compared to those living on an entirely different continent.
Interestingly, day-to-day fluctuations in wellbeing mirrored the salience of the war on social media as events unfolded. Specifically, wellbeing was lower on days when there were more tweets mentioning Ukraine on Twitter/X.
Our results indicate that, on average, it took around two months for people to return to their baseline levels of wellbeing after the invasion.
Different people, different recoveries
There are strong links between our wellbeing and our individual personalities.
However, the dip in wellbeing following the Russian invasion was fairly uniform across individuals. None of the individual factors assessed in our study, including personality and sociodemographic factors, predicted people’s response to the outbreak of war.
On the other hand, personality did play a role in how quickly people recovered. Individual differences in people’s recovery were linked to a personality trait called “stability”. Stability is a broad dimension of personality that combines low neuroticism with high agreeableness and conscientiousness (three traits from the Big Five personality framework).
Stability is so named because it reflects the stability of one’s overall psychological functioning. This can be illustrated by breaking stability down into its three components:
- low neuroticism describes emotional stability. People low in this trait experience less intense negative emotions such as anxiety, fear or anger, in response to negative events
- high agreeableness describes social stability. People high in this trait are generally more cooperative, kind, and motivated to maintain social harmony
- high conscientiousness describes motivational stability. People high in this trait show more effective patterns of goal-directed self-regulation.
So, our data show that people with less stable personalities fared worse in terms of recovering from the impact the war in Ukraine had on wellbeing.
In a supplementary analysis, we found the effect of stability was driven specifically by neuroticism and agreeableness. The fact that people higher in neuroticism recovered more slowly accords with a wealth of research linking this trait with coping difficulties and poor mental health.
These effects of personality on recovery were stronger than those of sociodemographic factors, such as age, gender or political views, which were not statistically significant.
Overall, our findings suggest that people with certain psychological vulnerabilities will often struggle to recover from the shock of global events such as the outbreak of war in Ukraine.
Luke Smillie, Professor in Personality Psychology, The University of Melbourne
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Could the shingles vaccine lower your risk of dementia?
A recent study has suggested Shingrix, a relatively new vaccine given to protect older adults against shingles, may delay the onset of dementia.
This might seem like a bizarre link, but actually, research has previously shown an older version of the shingles vaccine, Zostavax, reduced the risk of dementia.
In this new study, published last week in the journal Nature Medicine, researchers from the United Kingdom found Shingrix delayed dementia onset by 17% compared with Zostavax.
So how did the researchers work this out, and how could a shingles vaccine affect dementia risk?
From Zostavax to Shingrix
Shingles is a viral infection caused by the varicella-zoster virus. It causes painful rashes, and affects older people in particular.
Previously, Zostavax was used to vaccinate against shingles. It was administered as a single shot and provided good protection for about five years.
Shingrix has been developed based on a newer vaccine technology, and is thought to offer stronger and longer-lasting protection. Given in two doses, it’s now the preferred option for shingles vaccination in Australia and elsewhere.
In November 2023, Shingrix replaced Zostavax on the National Immunisation Program, making it available for free to those at highest risk of complications from shingles. This includes all adults aged 65 and over, First Nations people aged 50 and older, and younger adults with certain medical conditions that affect their immune systems.
What the study found
Shingrix was approved by the US Food and Drug Administration in October 2017. The researchers in the new study used the transition from Zostavax to Shingrix in the United States as an opportunity for research.
They selected 103,837 people who received Zostavax (between October 2014 and September 2017) and compared them with 103,837 people who received Shingrix (between November 2017 and October 2020).
By analysing data from electronic health records, they found people who received Shingrix had a 17% increase in “diagnosis-free time” during the follow-up period (up to six years after vaccination) compared with those who received Zostavax. This was equivalent to an average of 164 extra days without a dementia diagnosis.
The researchers also compared the shingles vaccines to other vaccines: influenza, and a combined vaccine for tetanus, diphtheria and pertussis. Shingrix and Zostavax performed around 14–27% better in lowering the risk of a dementia diagnosis, with Shingrix associated with a greater improvement.
The benefits of Shingrix in terms of dementia risk were significant for both sexes, but more pronounced for women. This is not entirely surprising, because we know women have a higher risk of developing dementia due to an interplay of biological factors, including greater sensitivity to certain genetic mutations associated with dementia, and hormonal differences.
Why the link?
The idea that vaccination against viral infection can lower the risk of dementia has been around for more than two decades. Associations have been observed between vaccines, such as those for diphtheria, tetanus, polio and influenza, and subsequent dementia risk.
Research has shown Zostavax vaccination can reduce the risk of developing dementia by 20% compared with people who are unvaccinated.
But it may not be that the vaccines themselves protect against dementia. Rather, it may be the resulting lack of viral infection creating this effect. Research indicates bacterial infections in the gut, as well as viral infections, are associated with a higher risk of dementia.
Notably, untreated infections with herpes simplex (herpes) virus – closely related to the varicella-zoster virus that causes shingles – can significantly increase the risk of developing dementia. Research has also shown shingles increases the risk of a later dementia diagnosis.
The mechanism is not entirely clear. But there are two potential pathways which may help us understand why infections could increase the risk of dementia.
First, certain molecules are produced when a baby is developing in the womb to help with the body’s development. These molecules have the potential to cause inflammation and accelerate ageing, so the production of these molecules is silenced around birth. However, viral infections such as shingles can reactivate the production of these molecules in adult life which could hypothetically lead to dementia.
Second, in Alzheimer’s disease, a specific protein called amyloid-β goes rogue and kills brain cells. Certain proteins produced by viruses such as SARS-CoV-2 (the virus behind COVID), and by harmful gut bacteria, have the potential to support amyloid-β in its toxic form. In laboratory conditions, these proteins have been shown to accelerate the onset of dementia.
What does this all mean?
With an ageing population, the burden of dementia is only likely to become greater in the years to come. There’s a lot more we have to learn about the causes of the disease and what we can potentially do to prevent and treat it.
This new study has some limitations. For example, time without a diagnosis doesn’t necessarily mean time without disease. Some people may have underlying disease with delayed diagnosis.
This research indicates Shingrix could have a silent benefit, but it’s too early to suggest we can use antiviral vaccines to prevent dementia.
Overall, we need more research exploring in greater detail how infections are linked with dementia. This will help us understand the root causes of dementia and design potential therapies.
Ibrahim Javed, Enterprise and NHMRC Emerging Leadership Fellow, UniSA Clinical & Health Sciences, University of South Australia
This article is republished from The Conversation under a Creative Commons license. Read the original article.