What Weston Price Got Right (And Wrong)
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Weston Price: What Stood The Test of Time?
This is Dr. Weston Price, a dentist. You may guess from the photo, or perhaps already knew, that his work is not new in 2023. We usually feature current health experts here, but we’re taking a day to do a blast from the past, because his ideas endure today, and inform a lot of people’s health views. So, he’s a good one to at least know about.
What was his deal?
Dr. Price (1870–1948) wanted to study focal infection theory—the idea that root canal treatment sealed in bacterial infections that then caused everything from heart disease to arthritis. His solution was that the affected teeth should be extracted instead.
This theory was popular in the 1920s, challenged in the 1930s, ignored in the 1940s (the world was a bit busy), and rejected by broad medical consensus in the 1950s. But while it was still being challenged in the 1930s, Dr. Price decided to find more evidence in its support.
The result was his famous world tour of peoples living traditional lifestyles without the influence of “modern” diet. His findings, and the conclusions he drew from them, extended to far more than just dental health.
What did he find?
Dr. Price found that people living traditional lifestyles, with their traditional diets based on locally-sourced foods, had much better overall health. Of course, he was a dentist and not a general practitioner, so aside from examining their teeth, he largely relied on self-reported diagnoses of illness, or lack thereof.
In short: he found that people in places without modern medical institutions had fewer diagnoses of disease. From this, he concluded that incidence of disease was much lower.
There was also an unexamined element of survivorship bias—an undiagnosed disease is more likely to be fatal, and he questioned only living people, which skewed the stats rather. Nor did he examine infant mortality or adult life expectancy, neither of which was great.
Was it all useless, then?
Actually no! He did hit upon some observations that have stood the test of time:
- He correctly concluded that modern diets with sugar and white flour were ruinous to the health.
- He correctly concluded that locally-sourced food, and grass-fed in the case of pastoral farming, tended to have much more nutritional value than the mass-produced results of intensive farming.
- He correctly concluded that many modern preservation methods robbed foods of their nutrients.
- He correctly concluded that many grains and seeds are more nutritious when fermented/soaked/sprouted.
About that “locally-sourced food”: the reason locally-sourced food tends to be more nutritious is that it has required less in the way of preservation for a long trip around the world, and will also tend to be fresher.
On the other hand, this does mean a lot of the foods that Dr. Price recommends are very much subject to availability. It may well be true that the Inuit people do not eat a lot of fruit and veg (which mostly do not grow there), but if you live in Nevada, maybe locally-sourced whale fat is just as difficult to find.
One person’s “this fatty organ meat contains the vitamin C we need” may be another person’s “that’s great; I have an apple tree in my garden though”.
Want to learn more?
Dr. Price’s most influential work is his magnum opus, “Nutrition and Physical Degeneration”. It’s a fascinating book in its historical context, but do be warned, it was written by a rich white man in 1939 and the writing is as racist as you might expect. Even when making favourable comparisons, the tone is very much “and here is what these savages are doing well”.
If you don’t fancy reading all that, here are two other sources about Weston Price’s work and conclusions, presented for balance:
- The Weston A. Price Foundation (Official Website)
- Weston Price’s Appalling Legacy (Science-Based Medicine.org)
Enjoy!
How do science journalists decide whether a psychology study is worth covering?
Complex research papers and data flood academic journals daily, and science journalists play a pivotal role in disseminating that information to the public. This can be a daunting task, requiring a keen understanding of the subject matter and the ability to translate dense academic language into narratives that resonate with the general public.
Several resources and tip sheets, including the Know Your Research section here at The Journalist’s Resource, aim to help journalists hone their skills in reporting on academic research.
But what factors do science journalists look for to decide whether a social science research study is trustworthy and newsworthy? That’s the question researchers at the University of California, Davis, and the University of Melbourne in Australia examine in a recent study, “How Do Science Journalists Evaluate Psychology Research?” published in September in Advances in Methods and Practices in Psychological Science.
Their online survey of 181 mostly U.S.-based science journalists looked at how and whether they were influenced by four factors in fictitious research summaries: the sample size (number of participants in the study), sample representativeness (whether the participants in the study were from a convenience sample or a more representative sample), the statistical significance level of the result (just barely statistically significant or well below the significance threshold), and the prestige of a researcher’s university.
The researchers found that sample size was the only factor that had a robust influence on journalists’ ratings of how trustworthy and newsworthy a study finding was.
University prestige had no effect, while the effects of sample representativeness and statistical significance were inconclusive.
But there’s nuance to the findings, the authors note.
“I don’t want people to think that science journalists aren’t paying attention to other things, and are only paying attention to sample size,” says Julia Bottesini, an independent researcher, a recent Ph.D. graduate from the Psychology Department at UC Davis, and the first author of the study.
Overall, the results show that “these journalists are doing a very decent job” vetting research findings, Bottesini says.
Also, the findings from the study are not generalizable to all science journalists or other fields of research, the authors note.
“Instead, our conclusions should be circumscribed to U.S.-based science journalists who are at least somewhat familiar with the statistical and replication challenges facing science,” they write. (Over the past decade a series of projects have found that the results of many studies in psychology and other fields can’t be reproduced, leading to what has been called a ‘replication crisis.’)
“This [study] is just one tiny brick in the wall and I hope other people get excited about this topic and do more research on it,” Bottesini says.
More on the study’s findings
The study’s findings can be useful for researchers who want to better understand how science journalists read their research and what kind of intervention — such as teaching journalists about statistics — can help journalists better understand research papers.
“As an academic, I take away the idea that journalists are a great population to try to study because they’re doing something really important and it’s important to know more about what they’re doing,” says Ellen Peters, director of the Center for Science Communication Research at the School of Journalism and Communication at the University of Oregon. Peters, who was not involved in the study, is also a psychologist who studies human judgment and decision-making.
Peters says the study was “overall terrific.” She adds that understanding how journalists do their work “is an incredibly important thing to do because journalists are who reach the majority of the U.S. with science news, so understanding how they’re reading some of our scientific studies and then choosing whether to write about them or not is important.”
The study, conducted between December 2020 and March 2021, is based on an online survey of journalists who said they at least sometimes covered science or other topics related to health, medicine, psychology, social sciences, or well-being. They were offered a $25 Amazon gift card as compensation.
Among the participants, 77% were women, 19% were men, 3% were nonbinary and 1% preferred not to say. About 62% said they had studied physical or natural sciences at the undergraduate level, and 24% at the graduate level. Also, 48% reported having a journalism degree. The study did not include the journalists’ news reporting experience level.
Participants were recruited through the professional network of Christie Aschwanden, an independent journalist and consultant on the study, which could be a source of bias, the authors note.
“Although the size of the sample we obtained (N = 181) suggests we were able to collect a range of perspectives, we suspect this sample is biased by an ‘Aschwanden effect’: that science journalists in the same professional network as C. Aschwanden will be more familiar with issues related to the replication crisis in psychology and subsequent methodological reform, a topic C. Aschwanden has covered extensively in her work,” they write.
Participants were randomly presented with eight of 22 one-paragraph fictitious social and personality psychology research summaries with fictitious authors. The summaries are posted on Open Science Framework, a free and open-source project management tool for researchers by the Center for Open Science, with a mission to increase openness, integrity and reproducibility of research.
For instance, one of the vignettes reads:
“Scientists at Harvard University announced today the results of a study exploring whether introspection can improve cooperation. 550 undergraduates at the university were randomly assigned to either do a breathing exercise or reflect on a series of questions designed to promote introspective thoughts for 5 minutes. Participants then engaged in a cooperative decision-making game, where cooperation resulted in better outcomes. People who spent time on introspection performed significantly better at these cooperative games (t (548) = 3.21, p = 0.001). ‘Introspection seems to promote better cooperation between people,’ says Dr. Quinn, the lead author on the paper.”
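For readers who want to see where a number like “t(548) = 3.21, p = 0.001” comes from, here’s a minimal sketch (assuming Python with SciPy installed; this is purely illustrative and not part of the study itself) that recovers the two-tailed p-value from the t-statistic and degrees of freedom reported in the vignette:

```python
# Illustrative only: recover the two-tailed p-value implied by the vignette's
# reported result, t(548) = 3.21. The p-value is the probability of seeing a
# t-statistic at least this extreme if introspection actually had no effect.
from scipy import stats

t_statistic = 3.21  # t-value as reported in the fictitious summary
df = 548            # degrees of freedom (550 participants split into two groups)

p_two_tailed = 2 * stats.t.sf(t_statistic, df)  # sf = upper-tail probability
print(f"two-tailed p = {p_two_tailed:.4f}")     # roughly 0.001, as reported
```

The smaller this number, the less likely the result is a chance fluke, which is why the survey contrasted findings that were just barely statistically significant with those well below the significance threshold.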
In addition to answering multiple-choice survey questions, participants were given the opportunity to answer open-ended questions, such as “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?”
Bottesini says those responses illuminated how science journalists analyze a research study. Participants often mentioned the prestige of the journal in which it was published or whether the study had been peer-reviewed. Many also seemed to value experimental research designs over observational studies.
Considering statistical significance
When it came to considering p-values, “some answers suggested that journalists do take statistical significance into account, but only very few included explanations that suggested they made any distinction between higher or lower p values; instead, most mentions of p values suggest journalists focused on whether the key result was statistically significant,” the authors write.
Also, many participants mentioned that it was very important to talk to outside experts or researchers in the same field to get a better understanding of the finding and whether it could be trusted, the authors write.
“Journalists also expressed that it was important to understand who funded the study and whether the researchers or funders had any conflicts of interest,” they write.
Participants also “indicated that making claims that were calibrated to the evidence was also important and expressed misgivings about studies for which the conclusions do not follow from the evidence,” the authors write.
In response to the open-ended question, “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?” some journalists wrote they checked whether the study was overstating conclusions or claims. Below are some of their written responses:
- “Is the researcher adamant that this study of 40 college kids is representative? If so, that’s a red flag.”
- “Whether authors make sweeping generalizations based on the study or take a more measured approach to sharing and promoting it.”
- “Another major point for me is how ‘certain’ the scientists appear to be when commenting on their findings. If a researcher makes claims which I consider to be over-the-top about the validity or impact of their findings, I often won’t cover.”
- “I also look at the difference between what an experiment actually shows versus the conclusion researchers draw from it — if there’s a big gap, that’s a huge red flag.”
Peters says the study’s findings show that “not only are journalists smart, but they have also gone out of their way to get educated about things that should matter.”
What other research shows about science journalists
A 2023 study, published in the International Journal of Communication, based on an online survey of 82 U.S. science journalists, aims to understand what they know and think about open-access research, including peer-reviewed journals and articles that don’t have a paywall, and preprints. Data was collected between October 2021 and February 2022. Preprints are scientific studies that have yet to be peer-reviewed and are shared on open repositories such as medRxiv and bioRxiv. The study finds that its respondents “are aware of OA and related issues and make conscious decisions around which OA scholarly articles they use as sources.”
A 2021 study, published in the Journal of Science Communication, looks at the impact of the COVID-19 pandemic on the work of science journalists. Based on an online survey of 633 science journalists from 77 countries, it finds that the pandemic somewhat brought scientists and science journalists closer together. “For most respondents, scientists were more available and more talkative,” the authors write. The pandemic has also provided an opportunity to explain the scientific process to the public, and remind them that “science is not a finished enterprise,” the authors write.
More than a decade ago, a 2008 study, published in PLOS Medicine and based on an analysis of 500 health news stories, found that “journalists usually fail to discuss costs, the quality of the evidence, the existence of alternative options, and the absolute magnitude of potential benefits and harms” when reporting on research studies. Giving journalists time to research and understand the studies, giving them space to publish and broadcast the stories, and training them in understanding academic research are some of the solutions to fill the gaps, writes Gary Schwitzer, the study author.
Advice for journalists
We asked Bottesini, Peters, Aschwanden and Tamar Wilner, a postdoctoral fellow at the University of Texas, who was not involved in the study, to share advice for journalists who cover research studies. Wilner is conducting a study on how journalism research informs the practice of journalism. Here are their tips:
1. Examine the study before reporting it.
Does the study claim match the evidence? “One thing that makes me trust the paper more is if their interpretation of the findings is very calibrated to the kind of evidence that they have,” says Bottesini. In other words, if the study makes a claim in its results that’s far-fetched, the authors should present a lot of evidence to back that claim.
Not all surprising results are newsworthy. If you come across a surprising finding from a single study, Peters advises you to step back and remember Carl Sagan’s quote: “Extraordinary claims require extraordinary evidence.”
How transparent are the authors about their data? For instance, are the authors posting information such as their data and the computer codes they use to analyze the data on platforms such as Open Science Framework, AsPredicted, or The Dataverse Project? Some researchers ‘preregister’ their studies, which means they share how they’re planning to analyze the data before they see them. “Transparency doesn’t automatically mean that a study is trustworthy,” but it gives others the chance to double-check the findings, Bottesini says.
Look at the study design. Is it an experimental study or an observational study? Observational studies can show correlations but not causation.
“Observational studies can be very important for suggesting hypotheses and pointing us towards relationships and associations,” Aschwanden says.
Experimental studies can provide stronger evidence toward a cause, but journalists must still be cautious when reporting the results, she advises. “If we end up implying causality, then once it’s published and people see it, it can really take hold,” she says.
Know the difference between preprints and peer-reviewed, published studies. Peer-reviewed papers tend to be of higher quality than those that are not peer-reviewed. Read our tip sheet on the difference between preprints and journal articles.
Beware of predatory journals. Predatory journals are journals that “claim to be legitimate scholarly journals, but misrepresent their publishing practices,” according to a 2020 journal article, published in the journal Toxicologic Pathology, “Predatory Journals: What They Are and How to Avoid Them.”
2. Zoom in on data.
Read the methods section of the study. The methods section of the study usually appears after the introduction and background section. “To me, the methods section is almost the most important part of any scientific paper,” says Aschwanden. “It’s amazing to me how often you read the design and the methods section, and anyone can see that it’s a flawed design. So just giving things a gut-level check can be really important.”
What’s the sample size? Not all good studies have large numbers of participants but pay attention to the claims a study makes with a small sample size. “If you have a small sample, you calibrate your claims to the things you can tell about those people and don’t make big claims based on a little bit of evidence,” says Bottesini.
But also remember that factors such as sample size and p-value are not “as clear cut as some journalists might assume,” says Wilner.
How representative of a population is the study sample? “If the study has a non-representative sample of, say, undergraduate students, and they’re making claims about the general population, that’s kind of a red flag,” says Bottesini. Aschwanden points to the acronym WEIRD, which stands for “Western, Educated, Industrialized, Rich, and Democratic,” and is used to highlight a lack of diversity in a sample. Studies based on such samples may not be generalizable to the entire population, she says.
Look at the p-value. Statistical significance is both confusing and controversial, but it’s important to consider. Read our tip sheet, “5 Things Journalists Need to Know About Statistical Significance,” to better understand it.
3. Talk to scientists not involved in the study.
If you’re not sure about the quality of a study, ask for help. “Talk to someone who is an expert in study design or statistics to make sure that [the study authors] use the appropriate statistics and that methods they use are appropriate because it’s amazing to me how often they’re not,” says Aschwanden.
Get an opinion from an outside expert. It’s always a good idea to present the study to other researchers in the field, who have no conflicts of interest and are not involved in the research you’re covering and get their opinion. “Don’t take scientists at their word. Look into it. Ask other scientists, preferably the ones who don’t have a conflict of interest with the research,” says Bottesini.
4. Remember that a single study is simply one piece of a growing body of evidence.
“I have a general rule that a single study doesn’t tell us very much; it just gives us proof of concept,” says Peters. “It gives us interesting ideas. It should be retested. We need an accumulation of evidence.”
Aschwanden says as a practice, she tries to avoid reporting stories about individual studies, with some exceptions such as very large, randomized controlled studies that have been underway for a long time and have a large number of participants. “I don’t want to say you never want to write a single-study story, but it always needs to be placed in the context of the rest of the evidence that we have available,” she says.
Wilner advises journalists to spend some time looking at the scope of research on the study’s specific topic and learn how it has been written about and studied up to that point.
“We would want science journalists to be reporting balance of evidence, and not focusing unduly on the findings that are just in front of them in a most recent study,” Wilner says. “And that’s a very difficult thing to ask journalists to do because they’re being asked to make their article very newsy, so it’s a difficult balancing act, but we can try and push journalists to do more of that.”
5. Remind readers that science is always changing.
“Science is always two steps forward, one step back,” says Peters. Give the public a notion of uncertainty, she advises. “This is what we know today. It may change tomorrow, but this is the best science that we know of today.”
Aschwanden echoes the sentiment. “All scientific results are provisional, and we need to keep that in mind,” she says. “It doesn’t mean that we can’t know anything, but it’s very important that we don’t overstate things.”
Authors of a study published in PNAS in January analyzed more than 14,000 psychology papers and found that replication success rates differ widely by psychology subfields. That study also found that papers that could not be replicated received more initial press coverage than those that could.
The authors note that the media “plays a significant role in creating the public’s image of science and democratizing knowledge, but it is often incentivized to report on counterintuitive and eye-catching results.”
Ideally, the news media would have a positive relationship with replication success rates in psychology, the authors of the PNAS study write. “Contrary to this ideal, however, we found a negative association between media coverage of a paper and the paper’s likelihood of replication success,” they write. “Therefore, deciding a paper’s merit based on its media coverage is unwise. It would be valuable for the media to remind the audience that new and novel scientific results are only food for thought before future replication confirms their robustness.”
Additional reading
Uncovering the Research Behaviors of Reporters: A Conceptual Framework for Information Literacy in Journalism. Katerine E. Boss, et al. Journalism & Mass Communication Educator, October 2022.
The Problem with Psychological Research in the Media. Steven Stosny. Psychology Today, September 2022.
Critically Evaluating Claims. Megha Satyanarayana. The Open Notebook, January 2022.
How Should Journalists Report a Scientific Study? Charles Binkley and Subramaniam Vincent. Markkula Center for Applied Ethics at Santa Clara University, September 2020.
What Journalists Get Wrong About Social Science: Full Responses. Brian Resnick. Vox, January 2016.
From The Journalist’s Resource
8 Ways Journalists Can Access Academic Research for Free
5 Things Journalists Need to Know About Statistical Significance
5 Common Research Designs: A Quick Primer for Journalists
5 Tips for Using PubPeer to Investigate Scientific Research Errors and Misconduct
What’s Standard Deviation? 4 Things Journalists Need to Know
This article first appeared on The Journalist’s Resource and is republished here under a Creative Commons license.
Tinnitus: Quieting The Unwanted Orchestra In Your Ears
Tinnitus—When a “minor” symptom becomes disruptive
Tinnitus (typically: ringing in the ears) is often thought of less as a condition in and of itself, and more a symptom related to other hearing-related conditions. Paradoxically, it can be associated with hearing loss as well as with hyperacusis (hearing supersensitivity, which sounds like a superpower, but can be quite a problem too).
More than just ringing
Tinnitus can manifest not just as ringing, but also as whistling, hissing, pulsing, buzzing, hooting, and more.
For those who don’t suffer from this, it can seem very trivial; for those who do… Sometimes it can seem trivial too!
But sometimes it’s hard to carry on a conversation when at random moments it suddenly sounds like someone is playing a slide-whistle directly into your earhole, or like maybe a fly got stuck in there.
It’s distracting, to say the least.
What causes it?
First let’s note, tinnitus can be acute or chronic. So, some of these things may just cause tinnitus for a while, whereas some may give you tinnitus for life. In some cases, it depends on how long the thing in question persisted for.
A lot of things can cause it, but common causes include:
- Noise exposure (e.g. concerts, some kinds of industrial work, war)
- High blood pressure
- Head/neck injuries
- Ear infection
- Autoimmune diseases (e.g. Type 1 Diabetes, Lupus, Multiple Sclerosis)
So what can be done about it?
Different remedies will work (or not) for different people, depending on the cause and type of tinnitus.
Be warned also: some things that will work for one person’s tinnitus will make another person’s worse, so you might need to try a degree of experimentation and some of it might not be fun!
That in mind, here are some things you might want to try if you haven’t already:
- Earplugs or noise-canceling headphones—while tinnitus is an internal sound, not external, it often has to do with some part(s) of your ears being unduly sensitive, so giving them less stimulus may ease the tinnitus that occurs in reaction to external noise.
- A great option (that this writer uses personally and considers a life-changer) is silicone earplugs that live in a little case on a keyring when not in use—no more heart-racing fleeing from supermarket checkout boops or pedestrian crossing bips or traffic noises or babies crying or (etc.)
- White noise—if you also have hyperacusis, a lower frequency range will probably not hurt the way a higher range might. If you don’t also have hyperacusis, you have more options here and this is a popular remedy. Either way, white noise outperforms “relaxing” soundscapes.
- Hearing aids—counterintuitively, for some people whose tinnitus has developed in response to hearing loss, hearing aids can help bring things “back to normal” and eliminate tinnitus in the process.
- Customized sound machines—if you have the resources to get fancy, science currently finds this to be best of all. They work like white noise, but are tailored to your specific tinnitus.
White Potato vs Sweet Potato – Which is Healthier?
Our Verdict
When comparing white potatoes to sweet potatoes, we picked the sweet potatoes.
Why?
In terms of macros, sweet potatoes are a little lighter on carbs and calories, though in the case of sugar and fiber, sweet potato has a few grams more of each, per potato. However, when an average sweet potato’s 7g of sugar are held against its 4g of fiber, this (much like with fruit!) is not a sugar you need to avoid.
See also: Which Sugars Are Healthier, And Which Are Just The Same?
The glycemic index of a sweet potato is also lower than that of a white potato, so the sugars it does have are slower-release.
Sweet potatoes are famously good sources of vitamin A and beta-carotene, important nutrients that white potatoes cannot boast.
Both plants are equally good sources of potassium and vitamin C.
Summary
Both are good sources of many nutrients, and any nutritional health-hazards associated with them come with the preparation (for example, frying introduces unhealthy fats, mashing makes the glycemic index skyrocket, and cooking with salt increases the salt content).
Baking either is great (consider stuffing them with delicious well-seasoned beans and/or tomatoes; if you make it yourself, pesto can be a great option too, as can cheese if you’re so-inclined and judicious with choice and quantity) and preserves almost all of their nutrients. Remember that nearly 100% of the fiber is in the skin, so you do want to eat that.
The deciding factor is: sweet potatoes are good sources of a couple more valuable nutrients that white potatoes aren’t, and come out as the overall healthiest for that reason.
Enjoy!
Is thirst a good predictor of dehydration?
Water is essential for daily functioning and health, and we can only survive a few days without it. Yet we constantly lose water through sweat, urination and even evaporation when we breathe.
This is why we have evolved a way to regulate and maintain water in our bodies. Like other animals, our survival relies on a strong biological drive that tells us to find and drink water to balance fluid loss.
This is thirst – a sensation of dryness in the mouth signalling we need to have a drink. This basic physiological mechanism is controlled mainly by part of the brain’s “control centre”, called the hypothalamus. The hypothalamus receives signals from various regions of the body and, in return, releases hormones that act as messengers to signal the thirst sensation.
What is dehydration?
Staying hydrated (having enough water in our bodies) is important for several reasons, including:
- regulating body temperature through sweat and respiration
- lubricating joints and eyes
- preventing infections
- digesting and absorbing nutrients
- flushing out waste (via the kidneys)
- preventing constipation
- brain function (including memory and concentration)
- mood and energy levels
- physical performance and recovery from exercise
- skin health.
Dehydration occurs when our body doesn’t have enough water. Even slight drops in fluid levels have noticeable consequences, such as headaches, feeling dizzy, lethargy and struggling to concentrate.
Chronic dehydration can pose more serious health risks, including urinary tract infections, constipation and kidney stones.
What does the evidence say?
Despite thirst being one of the most basic biological drivers for good hydration, science suggests our feelings of thirst and subsequent fluid intake don’t always correlate with hydration levels.
For example, a recent study explored the impact of thirst on fluid intake and hydration status. Participants attended a lab in the morning and then later in the afternoon to provide markers of hydration status (such as urine, blood samples and body weight). The relationship between levels of thirst in the morning and afternoon hydration status was negligible.
Further, thirst may be driven by environmental factors, such as access to water. For example, one study looked at whether ample access to water in a lab influenced how much people drank and how hydrated they were. The link between how thirsty they felt and how hydrated they were was weak, suggesting the availability of water influenced their fluid intake more than thirst.
Exercise can also change our thirst mechanism, though studies are limited at this stage.
Interestingly, research shows women experience thirst more strongly than men, regardless of hydration status. To understand gender differences in thirst, researchers infused men and women with fluids and then measured their thirst and how hydrated they were. They found women generally reported thirst at a lower level of fluid loss. Women have also been found to respond more to feeling thirsty by drinking more water.
Other ways to tell if you need to drink some water
While acknowledging some people will need to drink more or less, for many people, eight cups (or two litres) a day is a good amount of water to aim for.
But beyond thirst, there are many other ways to tell whether you might need to drink more water.
1. urine colour: pale yellow urine typically indicates good hydration, while darker, concentrated urine suggests dehydration
2. frequency of going to the toilet: urinating regularly (around four to six times a day) indicates good hydration. Infrequent urination can signal dehydration
3. skin turgor test: gently pinching the skin (for example, on the back of the hand) and observing how quickly the skin returns to its normal position can help assess hydration. Slow return may indicate dehydration
4. mouth and lips: a dry mouth or cracked lips can be early signs of dehydration
5. headaches and fatigue: frequent headaches, dizziness, or unexplained fatigue can be signs of inadequate hydration
6. sweating: in physically active people, monitoring how much they sweat during activity can help estimate fluid loss and hydration needs. Higher levels of sweat may predispose a person to dehydration if they are unable to replace the fluid lost through water intake
These indicators, used together, provide a more comprehensive picture of hydration without solely depending on the sensation of thirst.
Of course, if you do feel thirsty, it’s still a good idea to drink some water.
Lauren Ball, Professor of Community Health and Wellbeing, The University of Queensland and Kiara Too, PhD candidate, School of Human Movement and Nutrition Sciences, The University of Queensland
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Which Comes First, Cardio or Weights? – by Alex Hutchinson
This is a book of questions and answers, myths and busts, and in short, all things exercise.
It’s laid out as many micro-chapters with questions as headers. The explanations are clear and easy to understand, with several citations (of studies and other academic papers) per question.
While it’s quite comprehensive (weighing in at a hefty 300+ pages), it’s not the kind of book where one could just look up any given piece of information that one wants.
Its strength, rather, lies in pre-emptively arming the reader with knowledge, and correcting many commonly-believed myths. It can be read cover-to-cover, or just dipped into per what interests you (the table of contents lists all questions, so it’s easy to flip through).
Bottom line: if you’ve found the world of exercise a little confusing and would like it demystified, this book will result in a lot of “Oooooh” moments.
Click here to check out Which Comes First, Cardio or Weights?, and know your stuff!
PS: the short answer to the titular question is “mix it up and keep it varied”
Proteins Of The Week
This week’s news round-up is, entirely by chance, somewhat protein-centric in one form or another. So, check out the bad, the very bad, the mostly good, the inconvenient, and the worst:
Mediterranean diet vs the menopause
Researchers looked at hundreds of women with an average age of 51, and took note of their dietary habits vs their menopause symptoms. Most of them were consuming soft drinks and red meat, and were falling short of the recommendations for key food groups including vegetables, legumes, fruit, fish, and nuts; there was an association between greater adherence to Mediterranean diet principles and better health.
Read in full: Fewer soft drinks and less red meat may ease menopause symptoms: Study
Related: Four Ways To Upgrade The Mediterranean Diet
Listeria in meat
This one’s not a study, but it is relevant and important news. The headline pretty much says it all, so if you don’t eat meat, this isn’t one you need to worry about any further than that. If you do eat meat, though, you might want to check out the below article to find out whether the meat you eat might be carrying listeria:
Read in full: Almost 10 million pounds of meat recalled due to Listeria danger
Related: Frozen/Thawed/Refrozen Meat: How Much Is Safety, And How Much Is Taste?
Brawn and brain?
A study looked at cognitively healthy older adults (of whom 57% were women), and found an association between their muscle strength and their psychological wellbeing. Note that when we said “cognitively healthy”, this means being free from dementia etc—not necessarily psychologically healthy in all respects, such as also being free from depression and enjoying good self-esteem.
Read in full: Study links muscle strength and mental health in older adults
Related: Staying Strong: Tips To Prevent Muscle Loss With Age
The protein that blocks bone formation
This one’s more clinical but definitely of interest to any with osteoporosis or at high risk of osteoporosis. Researchers identified a specific protein that blocks osteoblast function, thus more of this protein means less bone production. Currently, this is not something that we as individuals can do anything about at home, but it is promising for future osteoporosis meds development.
Read in full: Protein blocking bone development could hold clues for future osteoporosis treatment
Related: Which Osteoporosis Medication, If Any, Is Right For You?
Rabies risk
People associate rabies with “rabid dogs”, but the biggest rabies threat is actually bats, and they don’t even necessarily need to bite you to transmit the disease (it suffices to have licked the skin, for instance—and bats are basically sky-puppies who will lick anything). Because rabies is virtually 100% fatal in unvaccinated humans once symptoms appear, this is very serious. This means that if you wake up and there’s a bat in the house, it doesn’t matter if it hasn’t bitten anyone; get thee to a hospital (where you can get the vaccine before the disease takes hold; this will still be very unpleasant but you’ll probably survive so long as you get the vaccine in time).
Read in full: What to know about bats and rabies
Related: Dodging Dengue In The US ← much less serious than rabies, but still not to be trifled with—particularly noteworthy if you’re in an area currently affected by floodwaters or even just unusually heavy rain, by the way, as this will leave standing water in which mosquitos breed.
Take care!