Castor Oil: All-Purpose Life-Changer, Or Snake Oil?
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
As “trending” health products go, castor oil is currently enjoying a lot of popularity, lauded as a life-changing miracle-worker, and social media is abuzz with advice to put it everywhere from your eyes to your vagina.
But:
- what things does science actually say it’s good for,
- what things lack evidence, and
- what things go into the category of “wow definitely do not do that”?
We don’t have the space to go into all of its proposed uses (there are simply far too many), but we’ll examine some common ones:
To heal/improve the skin barrier
Like most oils, it works as a moisturizer. In particular, its high (90%!) ricinoleic acid content does indeed make it good at that, and furthermore gives it properties that can help reduce skin inflammation and promote wound healing:
Bioactive polymeric formulations for wound healing ← there isn’t a conveniently quotable summary we can just grab here, but you can see the data and results, from which we can conclude:
- formulations with ricinoleic acid (such as with castor oil) performed very well for topical anti-inflammatory purposes
- they avoided the unwanted side effects associated with some other contenders
- they consistently beat other preparations in the category of wound-healing
To support hair growth and scalp health
There is no evidence that it helps. We’d love to provide a citation for this, but it’s simply not there. There’s also no evidence that it doesn’t help. For whatever reason, despite its popularity, the peer-reviewed science simply hasn’t been done for this, or if it has, it isn’t anywhere publicly accessible.
It’s possible that if a person is suffering hair loss specifically as a result of prostaglandin D2 levels, ricinoleic acid will inhibit the PGD2 and reverse the hair loss, but even this is hypothetical so far, as the science is currently only at the step before that.
However, due to some interesting chemistry, the combination of castor oil and warm water can result in acute (and irreversible) hair felting; in other words, the strands of hair suddenly glue together to become one mass, which then has to be cut off:
“Castor Oil” – The Culprit of Acute Hair Felting
👆 this is a case study, which is generally considered a low standard of evidence (compared to high-quality Randomized Controlled Trials, the highest standard), but let’s just say, this writer (hi, it’s me) isn’t risking her butt-length hair on the off-chance, and doesn’t advise you to, either. There are other hair oils out there; argan oil is great, and coconut oil is totally fine too.
As a laxative
This time, there’s a lot of evidence, and it’s even approved for this purpose by the FDA. However, it can be a bit too good, insofar as taking too much can result in diarrhea and uncomfortable cramping (the cramps are a feature, not a bug: the mechanism of action is stimulatory, i.e. it gets the intestines squeezing, but it can do that a bit too much for comfort):
Castor Oil: FDA-Approved Indications
To soothe dry eyes
While putting oil in your eyes may seem dubious, this is another one where it actually works:
❝Castor oil is deemed safe and tolerable, with strong anti-microbial, anti-inflammatory, anti-nociceptive, analgesic, antioxidant, wound healing and vasoconstrictive properties.
These can supplement deficient physiological tear film lipids, enabling enhanced lipid spreading characteristics and reducing aqueous tear evaporation.
Studies reveal that castor oil applied topically to the ocular surface has a prolonged residence time, facilitating increased tear film lipid layer thickness, stability, improved ocular surface staining and symptoms.❞
Source: Therapeutic potential of castor oil in managing blepharitis, meibomian gland dysfunction and dry eye
Against candidiasis (thrush)
We couldn’t find science for (or against) castor oil’s use against vaginal candidiasis, but here’s a study that investigated its use against oral candidiasis:
…in which castor oil was the only preparation that didn’t work against the yeast.
Summary
We left a lot unsaid today (so many proposed uses, it feels like a shame to skip them), but in few words: it’s good for skin (including wound healing) and eyes; but we’d give it a miss for hair, candidiasis, and digestive disorders.
Want to try some?
We don’t sell it, but here for your convenience is an example product on Amazon 😎
Take care!
Why scrapping the term ‘long COVID’ would be harmful for people with the condition
The assertion from Queensland’s chief health officer John Gerrard that it’s time to stop using the term “long COVID” has made waves in Australian and international media over recent days.
Gerrard’s comments were related to new research from his team finding long-term symptoms of COVID are similar to the ongoing symptoms following other viral infections.
But there are limitations in this research, and problems with Gerrard’s argument we should drop the term “long COVID”. Here’s why.
A bit about the research
The study involved texting a survey to 5,112 Queensland adults who had experienced respiratory symptoms and had sought a PCR test in 2022. Respondents were contacted 12 months after the PCR test. Some had tested positive to COVID, while others had tested positive to influenza or had not tested positive to either disease.
Survey respondents were asked if they had experienced ongoing symptoms or any functional impairment over the previous year.
The study found people with respiratory symptoms can suffer long-term symptoms and impairment, regardless of whether they had COVID, influenza or another respiratory disease. These symptoms are often referred to as “post-viral”, as they linger after a viral infection.
Gerrard’s research will be presented in April at the European Congress of Clinical Microbiology and Infectious Diseases. It hasn’t been published in a peer-reviewed journal.
After the research was publicised last Friday, some experts highlighted flaws in the study design. For example, Steven Faux, a long COVID clinician interviewed on ABC’s television news, said the study excluded people who were hospitalised with COVID (therefore leaving out people who had the most severe symptoms). He also noted differing levels of vaccination against COVID and influenza may have influenced the findings.
In addition, Faux pointed out the survey would have excluded many older people who may not use smartphones.
The authors of the research have acknowledged some of these and other limitations in their study.
Ditching the term ‘long COVID’
Based on the research findings, Gerrard said in a press release:
We believe it is time to stop using terms like ‘long COVID’. They wrongly imply there is something unique and exceptional about longer term symptoms associated with this virus. This terminology can cause unnecessary fear, and in some cases, hypervigilance to longer symptoms that can impede recovery.
But Gerrard and his team’s findings cannot substantiate these assertions. Their survey only documented symptoms and impairment after respiratory infections. It didn’t ask people how fearful they were, or whether a term such as long COVID made them especially vigilant, for example.
In discussing Gerrard’s conclusions about the terminology, Faux noted that even if only 3% of people develop long COVID (the survey found 3% of people had functional limitations after a year), this would equate to some 150,000 Queenslanders with the condition. He said:
To suggest that by not calling it long COVID you would be […] somehow helping those people not to focus on their symptoms is a curious conclusion from that study.
Another clinician and researcher, Philip Britton, criticised Gerrard’s conclusion about the language as “overstated and potentially unhelpful”. He noted the term “long COVID” is recognised by the World Health Organization as a valid description of the condition.
A cruel irony
An ever-growing body of research continues to show how COVID can cause harm to the body across organ systems and cells.
We know from the experiences shared by people with long COVID that the condition can be highly disabling, preventing them from engaging in study or paid work. It can also harm relationships with their friends, family members, and even their partners.
Despite all this, people with long COVID have often felt gaslit and unheard. When seeking treatment from health-care professionals, many people with long COVID report they have been dismissed or turned away.
Last Friday – the day Gerrard’s comments were made public – was actually International Long COVID Awareness Day, organised by activists to draw attention to the condition.
The response from people with long COVID was immediate. They shared their anger on social media about Gerrard’s comments, especially their timing, on a day designed to generate greater recognition for their illness.
Since the start of the COVID pandemic, patient communities have fought for recognition of the long-term symptoms many people faced.
The term “long COVID” was in fact coined by people suffering persistent symptoms after a COVID infection, who were seeking words to describe what they were going through.
The role people with long COVID have played in defining their condition and bringing medical and public attention to it demonstrates the possibilities of patient-led expertise. For decades, people with invisible or “silent” conditions such as ME/CFS (myalgic encephalomyelitis/chronic fatigue syndrome) have had to fight ignorance from health-care professionals and stigma from others in their lives. They have often been told their disabling symptoms are psychosomatic.
Gerrard’s comments, and the media’s amplification of them, repudiate the term “long COVID” that community members have chosen to give their condition an identity and support each other. This is likely to cause distress and exacerbate feelings of abandonment.
Terminology matters
The words we use to describe illnesses and conditions are incredibly powerful. Naming a new condition is a step towards better recognition of people’s suffering, and hopefully, better diagnosis, health care, treatment and acceptance by others.
The term “long COVID” provides an easily understandable label to convey patients’ experiences to others. It is well known to the public. It has been routinely used in news media reporting and in many reputable medical journal articles.
Most importantly, scrapping the label would further marginalise a large group of people with a chronic illness who have often been left to struggle behind closed doors.
Deborah Lupton, SHARP Professor, Vitalities Lab, Centre for Social Research in Health and Social Policy Centre, and the ARC Centre of Excellence for Automated Decision-Making and Society, UNSW Sydney
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Spoon-Fed – by Dr. Tim Spector
Dr. Spector looks at widespread beliefs about food, and where those often scientifically disproven beliefs come from. Hint: there’s usually some manner of “follow the money” involved.
From calorie-counting to cholesterol content, from fish to bottled water, to why of all the people who self-report having an allergy, only around half turn out to actually have one when tested, Dr. Spector sets the record straight.
The style is very down-to-earth and not at all self-aggrandizing; the author acknowledges his own mistakes and limitations along the way. As for pushing any particular agenda, his only agenda is clear: inform the public about bad science, so that we demand better science going forwards. He also gives us lots of information that can inform our personal health choices, based on better science than indiscriminate headlines wildly (and sometimes intentionally) misinterpreting results.
Read this book, and you may find yourself clicking through to read the studies for yourself, next time you see a bold headline.
Bottom line: this book looks at much of what’s wrong with what a lot of people believe about healthy eating. Regular 10almonds readers might not find much that’s new here, but it could be a great gift for a would-be health-conscious friend or relative.
When can my baby drink cow’s milk? It’s sooner than you think
Parents are often faced with well-meaning opinions and conflicting advice about what to feed their babies.
The latest guidance from the World Health Organization (WHO) recommends formula-fed babies can switch to cow’s milk from six months. Australian advice says parents should wait until 12 months. No wonder some parents, and the health professionals who advise them, are confused.
So what do parents need to know about the latest advice? And when is cow’s milk an option?
What’s the updated advice?
Last year, the WHO updated its global feeding guideline for children under two years old. This included recommending babies who are partially or totally formula fed can have whole animal milks (for example, full-fat cow’s milk) from six months.
This recommendation was made after a systematic review of research by WHO comparing the growth, health and development of babies fed infant formula from six months of age with those fed pasteurised or boiled animal milks.
The review found no evidence the growth and development of babies who were fed infant formula was any better than that of babies fed whole, fresh animal milks.
The review did find an increase in iron deficiency anaemia in babies fed fresh animal milk. However, WHO noted this could be prevented by giving babies iron-rich solid foods daily from six months.
On the strength of the available evidence, the WHO recommended babies fed infant formula, alone or in addition to breastmilk, can be fed animal milk or infant formula from six months of age.
The WHO said that animal milks fed to infants could include pasteurised full-fat fresh milk, reconstituted evaporated milk, fermented milk or yoghurt. But this should not include flavoured or sweetened milk, condensed milk or skim milk.
Why is this controversial?
Australian government guidelines recommend “cow’s milk should not be given as the main drink to infants under 12 months”. This seems to conflict with the updated WHO advice. However, WHO’s advice is targeted at governments and health authorities rather than directly at parents.
The Australian dietary guidelines are under review and the latest WHO advice is expected to inform that process.
OK, so how about iron?
Iron is an essential nutrient for everyone but it is particularly important for babies as it is vital for growth and brain development. Babies’ bodies usually store enough iron during the final few weeks of pregnancy to last until they are at least six months of age. However, if babies are born early (prematurely), if their umbilical cords are clamped too quickly or their mothers are anaemic during pregnancy, their iron stores may be reduced.
Cow’s milk is not a good source of iron. Most infant formula is made from cow’s milk and so has iron added. Breastmilk is also low in iron but much more of the iron in breastmilk is taken up by babies’ bodies than iron in cow’s milk.
Babies should not rely on milk (including infant formula) to supply iron after six months. So the latest WHO advice emphasises the importance of giving babies iron-rich solid foods from this age. These foods include:
- meat
- eggs
- vegetables, including beans and green leafy vegetables
- pulses, including lentils
- ground seeds and nuts (such as peanut or other nut butters, but with no added salt or sugar).
You may have heard that giving babies whole cow’s milk can cause allergies. In fact, whole cow’s milk is no more likely to cause allergies than infant formula based on cow’s milk.
What are my options?
The latest WHO recommendation that formula-fed babies can switch to cow’s milk from six months could save you money. Infant formula can cost more than five times as much as fresh milk (A$2.25-$8.30 a litre versus $1.50 a litre).
For families who continue to use infant formula, it may be reassuring to know that if infant formula becomes hard to get due to a natural disaster or some other supply chain disruption, fresh cow’s milk is fine to use from six months.
It is also important to know what has not changed in the latest feeding advice. WHO still recommends infants have only breastmilk for their first six months and then continue breastfeeding for up to two years or more. It is also still the case that infants under six months who are not breastfed or who need extra milk should be fed infant formula. Toddler formula for children over 12 months is not recommended.
All infant formula available in Australia must meet the same standard for nutritional composition and food safety. So, the cheapest infant formula is just as good as the most expensive.
What’s the take-home message?
The bottom line is your baby can safely switch from infant formula to fresh, full-fat cow’s milk from six months as part of a healthy diet with iron-rich foods. Likewise, cow’s milk can also be used to supplement or replace breastfeeding from six months, again alongside iron-rich foods.
If you have questions about introducing solids your GP, child health nurse or dietitian can help. If you need support with breastfeeding or starting solids you can call the National Breastfeeding Helpline (1800 686 268) or a lactation consultant.
Karleen Gribble, Adjunct Associate Professor, School of Nursing and Midwifery, Western Sydney University; Naomi Hull, PhD candidate, food security for infants and young children, University of Sydney, and Nina Jane Chad, Research Fellow, University of Sydney School of Public Health, University of Sydney
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Gluten: What’s The Truth?
We asked you for your health-related view of gluten, and got a spread of results. To put it simply:
Around 60% of voters chose “Gluten is bad if you have an allergy/sensitivity; otherwise fine”
The rest of the votes were split fairly evenly between the other three options:
- Gluten is bad for everyone and we should avoid it
- Gluten is bad if (and only if) you have Celiac disease
- Gluten is fine for all, and going gluten-free is a modern fad
First, let’s define some terms so that we’re all on the same page:
What is gluten?
Gluten is a category of protein found in wheat, barley, rye, and triticale. As such, it’s not one single compound, but a little umbrella of similar compounds. However, for the sake of not making this article many times longer, we’re going to refer to “gluten” without further specification.
What is Celiac disease?
Celiac disease is an autoimmune disease. As with many autoimmune diseases, we don’t know for sure how/why it occurs, but a combination of genetic and environmental factors has been strongly implicated, with the latter putatively including overexposure to gluten.
It affects about 1% of the world’s population, and people with Celiac disease will tend to respond adversely to gluten, notably by inflammation of the small intestine and destruction of enterocytes (the cells that line the wall of the small intestine). This in turn causes all sorts of other problems, beyond the scope of today’s main feature, but suffice it to say, it’s not pleasant.
What is an allergy/intolerance/sensitivity?
This may seem basic, but a lot of people conflate allergy/intolerance/sensitivity, so:
- An allergy is when the body mistakes a harmless substance for something harmful, and responds inappropriately. This can be mild (e.g. allergic rhinitis, hayfever) or severe (e.g. peanut allergy), and as such, responses can vary from “sniffly nose” to “anaphylactic shock and death”.
  - In the case of a wheat allergy (for example), this is usually somewhere between the two, and can for example cause breathing problems after ingesting wheat or inhaling wheat flour.
- An intolerance is when the body fails to correctly process something it should be able to process, and just ejects it half-processed instead.
  - A common and easily demonstrable example is lactose intolerance. There isn’t a well-defined analog for gluten, but gluten intolerance is nonetheless a well-reported thing.
- A sensitivity is when none of the above apply, but the body nevertheless experiences unpleasant symptoms after exposure to a substance that should normally be safe.
  - In the case of gluten, this is referred to as non-Celiac gluten sensitivity.
A word on scientific objectivity: at 10almonds we try to report science as objectively as possible. Sometimes people have strong feelings on a topic, especially if it is polarizing.
Sometimes people with a certain condition feel constantly disbelieved and mocked; sometimes people without a certain condition think others are imagining problems for themselves where there are none.
We can’t diagnose anyone or validate either side of that, but what we can do is report the facts as objectively as science can lay them out.
Gluten is fine for all, and going gluten-free is a modern fad: True or False?
Definitely False. Celiac disease is a real autoimmune disease that cannot be faked; allergies are also a real thing that people can have, and can likewise be validated in studies. Even intolerances have scientifically measurable symptoms and can be tested against nocebo.
See for example:
- Epidemiology and clinical presentations of Celiac disease
- Severe forms of food allergy that can precipitate allergic emergencies
- Properties of gluten intolerance: gluten structure, evolution, and pathogenicity
However! It may not be a modern fad so much as a genuine modern increase in incidence.
Widespread varieties of wheat today contain a lot more gluten than wheat of ages past, and many other molecular changes mean there are other compounds in modern grains that never even existed before.
However, the health-related impact of these (novel proteins and carbohydrates) is currently still speculative, and we are not in the business of speculating, so we’ll leave that as a “this hasn’t been studied enough to comment yet but we recognize it could potentially be a thing” factor.
Gluten is bad if (and only if) you have Celiac disease: True or False?
Definitely False; allergies, for example, are well-evidenced as real (same facts as we discussed/linked just above).
Gluten is bad for everyone and we should avoid it: True or False?
False, tentatively and contingently.
First, as established, there are people with clinically-evidenced Celiac disease, wheat allergy, or similar. Obviously, they should avoid triggering those diseases.
What about the rest of us, and what about those who have non-Celiac gluten sensitivity?
Clinical testing has found that among those reporting non-Celiac gluten sensitivity, nocebo-controlled studies validate the diagnosis in only a minority of cases.
In the following study, for example, only 16% of those reporting symptoms showed them in the trials, and 40% of those also showed a nocebo response (i.e., like placebo, but a bad rather than good effect):
This one, on the other hand, found that positive validations of diagnoses ranged between 7% and 77%, depending on the trial, with an average of 30%:
Re-challenge Studies in Non-celiac Gluten Sensitivity: A Systematic Review and Meta-Analysis
In other words: non-Celiac gluten sensitivity is a thing, and/but may be over-reported, and/but may be in some part exacerbated by psychosomatic effect.
Note: psychosomatic effect does not mean “imagining it” or “all in your head”. Indeed, the “soma” part of the word “psychosomatic” has to do with its measurable effect on the rest of the body.
For example, while pain can’t be easily objectively measured, other things, like inflammation, definitely can.
As for everyone else? If you’re enjoying your wheat (or similar) products, it’s well-established that they should be wholegrain for the best health impact (fiber is a positive for your health, whereas white flour’s super-fast metabolites pad the liver and cause metabolic problems).
Wheat itself may have other problems, for example FODMAPs, amylase trypsin inhibitors, and wheat germ agglutinins, but that’s “a wheat thing” rather than “a gluten thing”.
That’s beyond the scope of today’s main feature, but you might want to check out today’s featured book!
For a final scientific opinion on this last one, though, here’s what a respected academic journal of gastroenterology has to say:
From coeliac disease to noncoeliac gluten sensitivity; should everyone be gluten-free?
Oral vaccines could provide relief for people who suffer regular UTIs. Here’s how they work
In a recent TikTok video, Australian media personality Abbie Chatfield shared she was starting a vaccine to protect against urinary tract infections (UTIs).
Huge news for the UTI girlies. I am starting a UTI vaccine tonight for the first time.
Chatfield suffers from recurrent UTIs and has turned to the Uromune vaccine, an emerging option for those seeking relief beyond antibiotics.
But Uromune is not a traditional vaccine injected into your arm. So what is it, and how does it work?
First, what are UTIs?
UTIs are caused by bacteria entering the urinary system. This system includes the kidneys, bladder, ureters (thin tubes connecting the kidneys to the bladder), and the urethra (the tube through which urine leaves the body).
The most common culprit is Escherichia coli (E. coli), a type of bacteria normally found in the intestines.
While most types of E. coli are harmless in the gut, they can cause infection if they enter the urinary tract. UTIs are particularly prevalent in women due to their shorter urethras, which make it easier for bacteria to reach the bladder.
Roughly 50% of women will experience at least one UTI in their lifetime, and up to half of those will have a recurrence within six months.
The symptoms of a UTI typically include a burning sensation when you wee, frequent urges to go even when the bladder is empty, cloudy or strong-smelling urine, and pain or discomfort in the lower abdomen or back. If left untreated, a UTI can escalate into a kidney infection, which can require more intensive treatment.
While antibiotics are the go-to treatment for UTIs, the rise of antibiotic resistance and the fact many people experience frequent reinfections has sparked more interest in preventive options, including vaccines.
What is Uromune?
Uromune is a bit different to traditional vaccines that are injected into the muscle. It’s a sublingual spray, which means you spray it under your tongue. Uromune is generally used daily for three months.
It contains inactivated forms of four bacteria that are responsible for most UTIs, including E. coli. By introducing these bacteria in a controlled way, it helps your immune system learn to recognise and fight them off before they cause an infection. It can be classified as an immunotherapy.
A recent study involving 1,104 women found the Uromune vaccine was 91.7% effective at reducing recurrent UTIs after three months, with effectiveness dropping to 57.6% after 12 months.
These results suggest Uromune could provide significant (though time-limited) relief for women dealing with frequent UTIs; however, peer-reviewed research remains limited.
Any side effects of Uromune are usually mild and may include dry mouth, slight stomach discomfort, and nausea. These side effects typically go away on their own and very few people stop treatment because of them. In rare cases, some people may experience an allergic reaction.
How can I access it?
In Australia, Uromune has not received full approval from the Therapeutic Goods Administration (TGA), and so it’s not something you can just go and pick up from the pharmacy.
However, Uromune can be accessed via the TGA’s Special Access Scheme or the Authorised Prescriber pathway. This means a GP or specialist can apply for approval to prescribe Uromune for patients with recurrent UTIs. Once the patient has a form from their doctor documenting this approval, they can order the vaccine directly from the manufacturer.
Uromune is not covered under the Pharmaceutical Benefits Scheme, meaning patients must cover the full cost out-of-pocket. The cost of a treatment program is around A$320.
Uromune is similarly available through special access programs in places like the United Kingdom and Europe.
Other options in the pipeline
In addition to Uromune, scientists are exploring other promising UTI vaccines.
Uro-Vaxom is an established immunomodulator, a substance that helps regulate or modify the immune system’s response to bacteria. It’s derived from E. coli proteins and has shown success in reducing UTI recurrences in several studies. Uro-Vaxom is typically prescribed as a daily oral capsule taken for 90 days.
FimCH, another vaccine in development, targets something called the adhesin protein that helps E. coli attach to urinary tract cells. FimCH is typically administered through an injection and early clinical trials have shown promising results.
Meanwhile, StroVac, which is already approved in Germany, contains inactivated strains of bacteria such as E. coli and provides protection for up to 12 months, requiring a booster dose after that. This injection works by stimulating the immune system in the bladder, offering temporary protection against recurrent infections.
These vaccines show promise, but challenges like achieving long-term immunity remain. Research is ongoing to improve these options.
No magic bullet, but there’s reason for optimism
While vaccines such as Uromune may not be an accessible or perfect solution for everyone, they offer real hope for people tired of recurring UTIs and endless rounds of antibiotics.
Although the road to long-term relief might still be a bit bumpy, it’s exciting to see innovative treatments like these giving people more options to take control of their health.
Iris Lim, Assistant Professor in Biomedical Science, Bond University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Hoisin Sauce vs Teriyaki Sauce – Which is Healthier?
Our Verdict
When comparing hoisin sauce to teriyaki sauce, we picked the teriyaki sauce.
Why?
Neither is great! But spoonful for spoonful, the hoisin sauce has about 5x as much sugar.
Of course, exact amounts will vary by brand, but the hoisin will invariably be much more sugary than the teriyaki.
On the flipside, the teriyaki sauce may sometimes have slightly more salt, but the two are usually in the same ballpark of saltiness, so this is not a big deciding factor.
As a general rule of thumb, the first few ingredients will look like this for each, respectively:
Hoisin:
- Sugar
- Water
- Soybeans
Teriyaki:
- Soy sauce (water, soybeans, salt)
- Rice wine
- Sugar
In essence: hoisin is a soy-flavored syrup, while teriyaki is a sweetened soy sauce.
Wondering about that rice wine? The alcohol content is negligible, sufficiently so that teriyaki sauce is not considered alcoholic. For health purposes, it is well under the 0.05% required to be considered alcohol-free.
For religious purposes, we are not your rabbi or imam, but to our best understanding, teriyaki sauce is generally considered kosher* (the rice wine being made from rice) and halal (the rice wine being de-alcoholized by the processing, making the sauce non-intoxicating).
Want to try some?
You can compare these examples side-by-side yourself:
Enjoy!