Slowing the Progression of Cataracts
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Understanding Cataracts
Cataracts are natural and impact everyone.
That’s a bit of a daunting opening line, but as Dr. Michele Lee, a board-certified ophthalmologist, explains, cataracts naturally develop with age, and can be accelerated by factors such as trauma, certain medications, and specific eye conditions.
We know how important your vision is to you (we’ve had great feedback on the book Vision for Life, as well as on our articles about how glasses impact your eyesight and the effects of using eye drops).
While complete prevention isn’t possible, steps such as those mentioned below can be taken to slow their progression.
Here is an overview of the video’s first 3 takeaways. You can watch the whole video below.
Protect Your Eyes from Sunlight
Simply put, UV light damages lens proteins, which significantly contributes to cataract formation. According to the video, wearing sunglasses may prevent up to 20% of cataracts caused by UV exposure.
Moderate Alcohol Consumption
We all, at some level, know that alcohol consumption doesn’t do us any good. Your eye health isn’t an exception to the rule; alcohol has been shown to contribute to cataract development.
If you’re looking to reduce your alcohol use, try reading this guide on lowering, or eradicating, alcohol consumption.
Avoid Smoking
Smokers are 2-3 times more likely to develop cataracts. Additionally, ensure good ventilation while cooking to avoid exposure to harmful indoor smoke.
See all 5 steps in the video below:
How was the video? If you’ve discovered any great videos yourself that you’d like to share with fellow 10almonds readers, then please do email them to us!
How do science journalists decide whether a psychology study is worth covering?
Complex research papers and data flood academic journals daily, and science journalists play a pivotal role in disseminating that information to the public. This can be a daunting task, requiring a keen understanding of the subject matter and the ability to translate dense academic language into narratives that resonate with the general public.
Several resources and tip sheets, including the Know Your Research section here at The Journalist’s Resource, aim to help journalists hone their skills in reporting on academic research.
But what factors do science journalists look for to decide whether a social science research study is trustworthy and newsworthy? That’s the question researchers at the University of California, Davis, and the University of Melbourne in Australia examine in a recent study, “How Do Science Journalists Evaluate Psychology Research?” published in September in Advances in Methods and Practices in Psychological Science.
Their online survey of 181 mostly U.S.-based science journalists looked at how and whether they were influenced by four factors in fictitious research summaries: the sample size (number of participants in the study), sample representativeness (whether the participants in the study were from a convenience sample or a more representative sample), the statistical significance level of the result (just barely statistically significant or well below the significance threshold), and the prestige of a researcher’s university.
The researchers found that sample size was the only factor that had a robust influence on journalists’ ratings of how trustworthy and newsworthy a study finding was.
University prestige had no effect, while the effects of sample representativeness and statistical significance were inconclusive.
But there’s nuance to the findings, the authors note.
“I don’t want people to think that science journalists aren’t paying attention to other things, and are only paying attention to sample size,” says Julia Bottesini, an independent researcher, a recent Ph.D. graduate from the Psychology Department at UC Davis, and the first author of the study.
Overall, the results show that “these journalists are doing a very decent job” vetting research findings, Bottesini says.
Also, the findings from the study are not generalizable to all science journalists or other fields of research, the authors note.
“Instead, our conclusions should be circumscribed to U.S.-based science journalists who are at least somewhat familiar with the statistical and replication challenges facing science,” they write. (Over the past decade a series of projects have found that the results of many studies in psychology and other fields can’t be reproduced, leading to what has been called a ‘replication crisis.’)
“This [study] is just one tiny brick in the wall and I hope other people get excited about this topic and do more research on it,” Bottesini says.
More on the study’s findings
The study’s findings can be useful for researchers who want to better understand how science journalists read their research and what kind of intervention — such as teaching journalists about statistics — can help journalists better understand research papers.
“As an academic, I take away the idea that journalists are a great population to try to study because they’re doing something really important and it’s important to know more about what they’re doing,” says Ellen Peters, director of the Center for Science Communication Research at the School of Journalism and Communication at the University of Oregon. Peters, who was not involved in the study, is also a psychologist who studies human judgment and decision-making.
Peters says the study was “overall terrific.” She adds that understanding how journalists do their work “is an incredibly important thing to do because journalists are who reach the majority of the U.S. with science news, so understanding how they’re reading some of our scientific studies and then choosing whether to write about them or not is important.”
The study, conducted between December 2020 and March 2021, is based on an online survey of journalists who said they at least sometimes covered science or other topics related to health, medicine, psychology, social sciences, or well-being. They were offered a $25 Amazon gift card as compensation.
Among the participants, 77% were women, 19% were men, 3% were nonbinary, and 1% preferred not to say. About 62% said they had studied physical or natural sciences at the undergraduate level, and 24% at the graduate level. Also, 48% reported having a journalism degree. The study did not ask about the journalists’ level of news reporting experience.
Participants were recruited through the professional network of Christie Aschwanden, an independent journalist and consultant on the study, which could be a source of bias, the authors note.
“Although the size of the sample we obtained (N = 181) suggests we were able to collect a range of perspectives, we suspect this sample is biased by an ‘Aschwanden effect’: that science journalists in the same professional network as C. Aschwanden will be more familiar with issues related to the replication crisis in psychology and subsequent methodological reform, a topic C. Aschwanden has covered extensively in her work,” they write.
Participants were randomly presented with eight of 22 one-paragraph fictitious social and personality psychology research summaries with fictitious authors. The summaries are posted on Open Science Framework, a free and open-source project management tool for researchers by the Center for Open Science, with a mission to increase openness, integrity and reproducibility of research.
For instance, one of the vignettes reads:
“Scientists at Harvard University announced today the results of a study exploring whether introspection can improve cooperation. 550 undergraduates at the university were randomly assigned to either do a breathing exercise or reflect on a series of questions designed to promote introspective thoughts for 5 minutes. Participants then engaged in a cooperative decision-making game, where cooperation resulted in better outcomes. People who spent time on introspection performed significantly better at these cooperative games (t (548) = 3.21, p = 0.001). ‘Introspection seems to promote better cooperation between people,’ says Dr. Quinn, the lead author on the paper.”
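For the statistically curious, a reported result like t(548) = 3.21 can be sanity-checked in a few lines of Python. This is only a rough sketch using a normal approximation (which is very close at 548 degrees of freedom), not a description of how any real study was analyzed:

```python
from statistics import NormalDist

# Sanity-check the vignette's reported result: t(548) = 3.21, p = 0.001.
# With 548 degrees of freedom, the t-distribution is near-identical to
# the standard normal, so a normal approximation is good enough here.
t_statistic = 3.21
p_two_tailed = 2 * (1 - NormalDist().cdf(t_statistic))
print(f"approximate two-tailed p ≈ {p_two_tailed:.4f}")
```

The result comes out at roughly 0.0013, consistent with the vignette’s “p = 0.001” after rounding; a mismatch here would be a red flag worth querying.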
In addition to answering multiple-choice survey questions, participants were given the opportunity to answer open-ended questions, such as “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?”
Bottesini says those responses illuminated how science journalists analyze a research study. Participants often mentioned the prestige of the journal in which it was published or whether the study had been peer-reviewed. Many also seemed to value experimental research designs over observational studies.
Considering statistical significance
When it came to considering p-values, “some answers suggested that journalists do take statistical significance into account, but only very few included explanations that suggested they made any distinction between higher or lower p values; instead, most mentions of p values suggest journalists focused on whether the key result was statistically significant,” the authors write.
Also, many participants mentioned that it was very important to talk to outside experts or researchers in the same field to get a better understanding of the finding and whether it could be trusted, the authors write.
“Journalists also expressed that it was important to understand who funded the study and whether the researchers or funders had any conflicts of interest,” they write.
Participants also “indicated that making claims that were calibrated to the evidence was also important and expressed misgivings about studies for which the conclusions do not follow from the evidence,” the authors write.
In response to the open-ended question, “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?” some journalists wrote they checked whether the study was overstating conclusions or claims. Below are some of their written responses:
- “Is the researcher adamant that this study of 40 college kids is representative? If so, that’s a red flag.”
- “Whether authors make sweeping generalizations based on the study or take a more measured approach to sharing and promoting it.”
- “Another major point for me is how ‘certain’ the scientists appear to be when commenting on their findings. If a researcher makes claims which I consider to be over-the-top about the validity or impact of their findings, I often won’t cover.”
- “I also look at the difference between what an experiment actually shows versus the conclusion researchers draw from it — if there’s a big gap, that’s a huge red flag.”
Peters says the study’s findings show that “not only are journalists smart, but they have also gone out of their way to get educated about things that should matter.”
What other research shows about science journalists
A 2023 study, published in the International Journal of Communication, based on an online survey of 82 U.S. science journalists, aims to understand what they know and think about open-access research, including peer-reviewed journals and articles that don’t have a paywall, and preprints. Data was collected between October 2021 and February 2022. Preprints are scientific studies that have yet to be peer-reviewed and are shared on open repositories such as medRxiv and bioRxiv. The study finds that its respondents “are aware of OA and related issues and make conscious decisions around which OA scholarly articles they use as sources.”
A 2021 study, published in the Journal of Science Communication, looks at the impact of the COVID-19 pandemic on the work of science journalists. Based on an online survey of 633 science journalists from 77 countries, it finds that the pandemic somewhat brought scientists and science journalists closer together. “For most respondents, scientists were more available and more talkative,” the authors write. The pandemic has also provided an opportunity to explain the scientific process to the public, and remind them that “science is not a finished enterprise,” the authors write.
More than a decade ago, a 2008 study, published in PLOS Medicine and based on an analysis of 500 health news stories, found that “journalists usually fail to discuss costs, the quality of the evidence, the existence of alternative options, and the absolute magnitude of potential benefits and harms” when reporting on research studies. Giving journalists time to research and understand the studies, giving them space for publication and broadcast of their stories, and training them in understanding academic research are some of the solutions to fill the gaps, writes Gary Schwitzer, the study author.
Advice for journalists
We asked Bottesini, Peters, Aschwanden and Tamar Wilner, a postdoctoral fellow at the University of Texas, who was not involved in the study, to share advice for journalists who cover research studies. Wilner is conducting a study on how journalism research informs the practice of journalism. Here are their tips:
1. Examine the study before reporting it.
Does the study claim match the evidence? “One thing that makes me trust the paper more is if their interpretation of the findings is very calibrated to the kind of evidence that they have,” says Bottesini. In other words, if the study makes a claim in its results that’s far-fetched, the authors should present a lot of evidence to back that claim.
Not all surprising results are newsworthy. If you come across a surprising finding from a single study, Peters advises you to step back and remember Carl Sagan’s quote: “Extraordinary claims require extraordinary evidence.”
How transparent are the authors about their data? For instance, are the authors posting information such as their data and the computer codes they use to analyze the data on platforms such as Open Science Framework, AsPredicted, or The Dataverse Project? Some researchers ‘preregister’ their studies, which means they share how they’re planning to analyze the data before they see them. “Transparency doesn’t automatically mean that a study is trustworthy,” but it gives others the chance to double-check the findings, Bottesini says.
Look at the study design. Is it an experimental study or an observational study? Observational studies can show correlations but not causation.
“Observational studies can be very important for suggesting hypotheses and pointing us towards relationships and associations,” Aschwanden says.
Experimental studies can provide stronger evidence toward a cause, but journalists must still be cautious when reporting the results, she advises. “If we end up implying causality, then once it’s published and people see it, it can really take hold,” she says.
Know the difference between preprints and peer-reviewed, published studies. Peer-reviewed papers tend to be of higher quality than those that are not peer-reviewed. Read our tip sheet on the difference between preprints and journal articles.
Beware of predatory journals. Predatory journals are journals that “claim to be legitimate scholarly journals, but misrepresent their publishing practices,” according to a 2020 journal article, published in the journal Toxicologic Pathology, “Predatory Journals: What They Are and How to Avoid Them.”
2. Zoom in on data.
Read the methods section of the study. The methods section of the study usually appears after the introduction and background section. “To me, the methods section is almost the most important part of any scientific paper,” says Aschwanden. “It’s amazing to me how often you read the design and the methods section, and anyone can see that it’s a flawed design. So just giving things a gut-level check can be really important.”
What’s the sample size? Not all good studies have large numbers of participants but pay attention to the claims a study makes with a small sample size. “If you have a small sample, you calibrate your claims to the things you can tell about those people and don’t make big claims based on a little bit of evidence,” says Bottesini.
But also remember that factors such as sample size and p-value are not “as clear cut as some journalists might assume,” says Wilner.
How representative of a population is the study sample? “If the study has a non-representative sample of, say, undergraduate students, and they’re making claims about the general population, that’s kind of a red flag,” says Bottesini. Aschwanden points to the acronym WEIRD, which stands for “Western, Educated, Industrialized, Rich, and Democratic,” and is used to highlight a lack of diversity in a sample. Studies based on such samples may not be generalizable to the entire population, she says.
Look at the p-value. Statistical significance is both confusing and controversial, but it’s important to consider. Read our tip sheet, “5 Things Journalists Need to Know About Statistical Significance,” to better understand it.
3. Talk to scientists not involved in the study.
If you’re not sure about the quality of a study, ask for help. “Talk to someone who is an expert in study design or statistics to make sure that [the study authors] use the appropriate statistics and that methods they use are appropriate because it’s amazing to me how often they’re not,” says Aschwanden.
Get an opinion from an outside expert. It’s always a good idea to present the study to other researchers in the field, who have no conflicts of interest and are not involved in the research you’re covering and get their opinion. “Don’t take scientists at their word. Look into it. Ask other scientists, preferably the ones who don’t have a conflict of interest with the research,” says Bottesini.
4. Remember that a single study is simply one piece of a growing body of evidence.
“I have a general rule that a single study doesn’t tell us very much; it just gives us proof of concept,” says Peters. “It gives us interesting ideas. It should be retested. We need an accumulation of evidence.”
Aschwanden says as a practice, she tries to avoid reporting stories about individual studies, with some exceptions such as very large, randomized controlled studies that have been underway for a long time and have a large number of participants. “I don’t want to say you never want to write a single-study story, but it always needs to be placed in the context of the rest of the evidence that we have available,” she says.
Wilner advises journalists to spend some time looking at the scope of research on the study’s specific topic and learn how it has been written about and studied up to that point.
“We would want science journalists to be reporting balance of evidence, and not focusing unduly on the findings that are just in front of them in a most recent study,” Wilner says. “And that’s a very difficult thing to ask journalists to do, because they’re being asked to make their article very newsy, so it’s a difficult balancing act, but we can try and push journalists to do more of that.”
5. Remind readers that science is always changing.
“Science is always two steps forward, one step back,” says Peters. Give the public a notion of uncertainty, she advises. “This is what we know today. It may change tomorrow, but this is the best science that we know of today.”
Aschwanden echoes the sentiment. “All scientific results are provisional, and we need to keep that in mind,” she says. “It doesn’t mean that we can’t know anything, but it’s very important that we don’t overstate things.”
Authors of a study published in PNAS in January analyzed more than 14,000 psychology papers and found that replication success rates differ widely by psychology subfields. That study also found that papers that could not be replicated received more initial press coverage than those that could.
The authors note that the media “plays a significant role in creating the public’s image of science and democratizing knowledge, but it is often incentivized to report on counterintuitive and eye-catching results.”
Ideally, the news media would have a positive relationship with replication success rates in psychology, the authors of the PNAS study write. “Contrary to this ideal, however, we found a negative association between media coverage of a paper and the paper’s likelihood of replication success,” they write. “Therefore, deciding a paper’s merit based on its media coverage is unwise. It would be valuable for the media to remind the audience that new and novel scientific results are only food for thought before future replication confirms their robustness.”
Additional reading
- Uncovering the Research Behaviors of Reporters: A Conceptual Framework for Information Literacy in Journalism. Katerine E. Boss, et al. Journalism & Mass Communication Educator, October 2022.
- The Problem with Psychological Research in the Media. Steven Stosny. Psychology Today, September 2022.
- Critically Evaluating Claims. Megha Satyanarayana, The Open Notebook, January 2022.
- How Should Journalists Report a Scientific Study? Charles Binkley and Subramaniam Vincent. Markkula Center for Applied Ethics at Santa Clara University, September 2020.
- What Journalists Get Wrong About Social Science: Full Responses. Brian Resnick. Vox, January 2016.
From The Journalist’s Resource
8 Ways Journalists Can Access Academic Research for Free
5 Things Journalists Need to Know About Statistical Significance
5 Common Research Designs: A Quick Primer for Journalists
5 Tips for Using PubPeer to Investigate Scientific Research Errors and Misconduct
What’s Standard Deviation? 4 Things Journalists Need to Know
This article first appeared on The Journalist’s Resource and is republished here under a Creative Commons license.
Vit D + Calcium: Too Much Of A Good Thing?
- Myth: you can’t get too much calcium!
- Myth: you must get as much vitamin D as possible!
Let’s tackle calcium first:
❝Calcium is good for you! You need more calcium for your bones! Be careful you don’t get calcium-deficient!❞
Those comments seem reasonable, provided that you don’t already have the right amount of calcium. Most people know what happens in the case of too little calcium: brittle bones, osteoporosis, and so forth.
But what about too much?
Hypercalcemia
Having too much calcium (hypercalcemia) can lead to problems with…
- Groans: gastrointestinal pain, nausea, and vomiting. Peptic ulcer disease and pancreatitis.
- Bones: bone-related pains. Osteoporosis, osteomalacia, arthritis and pathological fractures.
- Stones: kidney stones causing pain.
- Moans: fatigue and malaise.
- Thrones: polyuria, polydipsia, and constipation.
- Psychic overtones: lethargy, confusion, depression, and memory loss.
(mnemonic courtesy of Sadiq et al, 2022)
What causes this, and how do we avoid it? Is it just dietary?
It’s mostly not dietary!
Overconsumption of calcium is certainly possible, but not common unless one has an extreme diet and/or over-supplementation. However…
Too much vitamin D
Again with “too much of a good thing”! While keeping good levels of vitamin D is, obviously, good, overdoing it (including commonly prescribed super-therapeutic doses of vitamin D) can lead to hypercalcemia.
This happens because vitamin D promotes calcium absorption in the gut, acting as gatekeeper to the bloodstream.
Normally, the body only absorbs 10–20% of the calcium we consume, and that’s all well and good. But with overly high vitamin D levels, the other 80–90% can be waved on through, and that is very much Not Good™.
See for yourself:
- Hypercalcemia of Malignancy: An Update on Pathogenesis and Management
- Vitamin D-Mediated Hypercalcemia: Mechanisms, Diagnosis, and Treatment
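To see why that absorption shift matters, here’s a back-of-envelope sketch; the 1,000 mg daily intake is a hypothetical illustration, not a figure from this article:

```python
# Illustrative arithmetic for the absorption figures above.
# NOTE: the 1,000 mg daily intake is a hypothetical example value.
daily_intake_mg = 1000

normal_absorption = (0.10, 0.20)  # typical: only 10-20% is absorbed
excess_absorption = 0.90          # near-total absorption under excess vitamin D

normal_mg = [daily_intake_mg * f for f in normal_absorption]
excess_mg = daily_intake_mg * excess_absorption

print(f"Normally absorbed: {normal_mg[0]:.0f}-{normal_mg[1]:.0f} mg")
print(f"With excess vitamin D: up to ~{excess_mg:.0f} mg")
```

On these illustrative numbers, excess vitamin D could multiply absorbed calcium several times over, which is the mechanism behind vitamin D-mediated hypercalcemia.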
How much is too much?
The United States’ Office of Dietary Supplements defines 4000 IU (100μg) as a high daily dose of vitamin D, and recommends 600 IU (15μg) as a daily dose, or 800 IU (20μg) if aged over 70.
See for yourself: Vitamin D Fact Sheet for Health Professionals ← there’s quite a bit of extra info there too
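If you ever need to convert between the two units yourself, vitamin D’s conversion factor is fixed at 40 IU per microgram, which you can verify against the figures above:

```python
# Vitamin D doses are quoted in both IU and micrograms (mcg / μg);
# for vitamin D specifically, 40 IU = 1 mcg.
IU_PER_MCG = 40

def iu_to_mcg(iu: float) -> float:
    """Convert a vitamin D dose from IU to micrograms."""
    return iu / IU_PER_MCG

# The three doses mentioned above:
for iu in (600, 800, 4000):
    print(f"{iu} IU = {iu_to_mcg(iu):.0f} mcg")
```

This reproduces the pairings in the fact sheet: 600 IU = 15 μg, 800 IU = 20 μg, 4000 IU = 100 μg.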
Cognitive Enhancement Without Drugs
This is Elizabeth Ricker. She’s a Harvard-and-MIT-trained neuroscientist and researcher, who now runs the “Citizen Science” DIY-neurohacking organization, NeuroEducate.
Sounds fun! What’s it about?
The philosophy that spurs on her research and practice can be summed up as follows:
❝I’m not going to leave my brain up to my doctor or [anyone else]… My brain is my own responsibility, and I’m going to do the best that I can to optimize it❞
Her goal is not just to optimize her own brain though; she wants to make the science accessible to everyone.
What’s this about Citizen Science?
“Citizen Science” is the idea that while there’s definitely an important role in society for career academics, science itself should be accessible to all. And, not just the conclusions, but the process too.
This can take the form of huge experiments, often facilitated these days by apps where we opt-in to allow our health metrics (for example) to be collated with many thousands of others, for science. It can also involve such things as we talked about recently, getting our own raw genetic data and “running the numbers” at home to get far more comprehensive and direct information than the genetic testing company would ever provide us.
For Ricker, her focus is on the neuroscience side of biohacking, thus, neurohacking.
I’m ready to hack my brain! Do I need a drill?
Happily not! Although… Bone drills for the skull are very convenient instruments that make it quite hard to go wrong even with minimal training. The drill bit has a little step/ledge partway down, which means you can only drill through the thickness of the skull itself, before the bone meeting the wider part of the bit stops you from accidentally drilling into the brain. Still, please don’t do this at home.
What you can do at home is a different kind of self-experimentation…
If you want to consider which things are genuinely resulting in cognitive enhancement and which things are not, you need to approach the matter like a scientist. That means going about it in an organized fashion, and recording results.
There are several ways cognitive enhancement can be measured, including:
- Learning and memory
- Executive function
- Emotional regulation
- Creative intelligence
Let’s look at each of them, and what can be done. We don’t have a lot of room here (we’re a newsletter, not a book), but we’ll cover one of Ricker’s approaches for each:
Learning and memory
This one’s easy. We’re going to leverage neuroplasticity (neurons that fire together, wire together!) by simple practice, and introduce an extra element to go alongside your recall. Perhaps a scent, or a certain item of clothing. Tell yourself that clinical studies have shown that this will boost your recall. It’s true, but that’s not what’s important; what’s important is that you believe it, and bring the placebo effect to bear on your endeavors.
You can test your memory with word lists, generated randomly by AI, such as this one:
You’ll soon find your memory improving—but don’t take our word for it!
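If you’d like to generate practice lists yourself, here’s a minimal sketch; the word pool is our own illustrative choice, not from Ricker’s materials:

```python
import random

# Generate a random word list to memorize, then test your recall later.
# The word pool below is purely illustrative; any set of common,
# unrelated words works for this kind of self-test.
WORD_POOL = [
    "anchor", "breeze", "candle", "dagger", "ember", "fossil",
    "garnet", "harbor", "icicle", "jungle", "kettle", "lantern",
    "marble", "nectar", "orchid", "pebble", "quiver", "ripple",
]

def make_word_list(n=10, seed=None):
    """Return n distinct random words; pass a seed for a repeatable list."""
    rng = random.Random(seed)
    return rng.sample(WORD_POOL, n)

words = make_word_list(10)
print("Memorize:", ", ".join(words))
```

Run it, study the list for a minute, then try writing the words down an hour later; tracking your score over days gives you the kind of record a self-experiment needs.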
Executive function
Executive function is the aspect of your brain that tells the other parts how to work, when to work, and when to stop working. If you’ve ever spent 30 minutes thinking “I need to get up” but you were stuck in scrolling social media, that was executive dysfunction.
This can be trained using the Stroop Color and Word Test, which shows you color names printed in ink that doesn’t necessarily match the word’s meaning. For example, you might be shown the word “red” colored green. Your task is to name either only the ink color (ignoring the word itself) or only the word’s meaning (ignoring its appearance). It can be quite challenging, but you’ll get better quite quickly:
The Stroop Test: Online Version
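For the DIY-inclined, the same idea can be sketched in a few lines of Python, printing incongruent color words using ANSI terminal colors. This is an illustrative toy, not the validated clinical test:

```python
import random

# A minimal Stroop-style drill: print color words in mismatched "ink"
# colors (ANSI escape codes) and name the ink color aloud, not the word.
COLORS = {"red": "31", "green": "32", "yellow": "33", "blue": "34"}

def make_trial(rng):
    """Return an incongruent (word, ink) pair, e.g. the word 'red' in green ink."""
    word = rng.choice(list(COLORS))
    ink = rng.choice([c for c in COLORS if c != word])  # force a mismatch
    return word, ink

rng = random.Random()
for _ in range(5):
    word, ink = make_trial(rng)
    print(f"\033[{COLORS[ink]}m{word.upper()}\033[0m   (say the ink color, not the word)")
```

Timing yourself over a fixed number of trials, and logging the results, turns this into the kind of repeatable measurement the self-experimentation approach calls for.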
Emotional Regulation
This is the ability to not blow up angrily at the person with whom you need to be diplomatic, or to refrain from laughing when you thought of something funny in a sombre situation.
It’s an important part of cognitive function, and success or failure can have quite far-reaching consequences in life. And, it can be trained too.
There’s no online widget for this one, but: when and if you’re in a position to safely* do so, think about something that normally triggers a strong unwanted emotional reaction. It doesn’t have to be something life-shattering, but just something that you feel in some way bad about. Hold this in your mind, sit with it, and practice mindfulness. The idea is to be able to hold the unpleasant idea in your mind, without becoming reactive to it, or escaping to more pleasant distractions. Build this up.
*if you perchance have PTSD, C-PTSD, or an emotional regulation disorder, you might want to talk this one through with a qualified professional first.
Creative Intelligence
Another important cognitive skill, and again, one that can be cultivated and grown.
The trick here is volume. A good, repeatable test is to think of a common object (e.g. a rock, a towel, a banana) and, within a time constraint (such as 15 minutes) list how many uses you can think of for that item.
Writer’s storytime: once upon a time, I was sorting through an inventory of medical equipment with a colleague, and suggested throwing out our old arterial clamps, as we had newer, better ones—in abundance. My colleague didn’t want to part with them, so I challenged him “Give me one use for these, something we could in some possible world use them for that the new clamps don’t do better, and we’ll keep them”. He said “Thumbscrews”, and I threw my hands up in defeat, saying “Fine!”, as he had technically fulfilled my condition.
What’s the hack to improve this one? Just more volume. Creativity, as it turns out, isn’t something we can expend—like a muscle, it grows the more we use it. And because the above test is repeatable (with different objects), you can track your progress.
And if you feel like using your grown creative muscle to write/paint/compose/etc your magnum opus, great! Or if you just want to apply it to the problem-solving of everyday life, also great!
In summary…
Our brain is a wonderful organ with many functions. Society expects us to lose these as we get older, but the simple, scientific truth is that we can not only maintain our cognitive function, but also enhance and grow it as we go.
Want to know more from today’s featured expert?
You might enjoy her book, “Smarter Tomorrow”, which we reviewed back in March
Related Posts
More Salt, Not Less?
It’s Q&A Day at 10almonds!
Have a question or a request? We love to hear from you!
In cases where we’ve already covered something, we might link to what we wrote before, but will always be happy to revisit any of our topics again in the future too—there’s always more to say!
As ever: if the question/request can be answered briefly, we’ll do it here in our Q&A Thursday edition. If not, we’ll make a main feature of it shortly afterwards!
So, no question/request is too big or small!
❝I’m curious about the salt part – learning about LMNT and what they say about us needing more salt than what’s recommended by the government, would you mind looking into that? From a personal experience, I definitely noticed a massive positive difference during my 3-5 day water fasts when I added salt to my water compared to when I just drank water. So I’m curious what the actual range for salt intake is that we should be aiming for.❞
That’s a fascinating question, and we’ll have to tackle it in several parts:
When fasting
3–5 days is a long time to take only water; we’re sure you know most people fast from food for much less time than that. Nevertheless, when fasting, the body needs more water than usual, because metabolism increases as bodily resources are freed up for cellular maintenance. Water is necessary for replacing cells (which are themselves mostly water, by mass), for ferrying nutrients around the body, and for escorting unwanted substances out of it.
Normally, the body’s natural osmoregulatory process handles this, balancing water with salts of various kinds, to maintain homeostasis.
However, it can only do that if it has the requisite parts (e.g. water and salts), and if you’re fasting from food, you’re not replenishing lost salts unless you supplement.
Normally, monitoring our salt intake can be a bit of a guessing game, but when fasting for an entire day, it’s clear how much salt we consumed in our food that day: zero.
So, taking the recommended amount of sodium, which varies but is usually in the 1200–1500mg range (low end if aged 70+; high end if aged under 50), becomes sensible.
More detail: How Much Sodium You Need Per Day
See also, on a related note:
When To Take Electrolytes (And When We Shouldn’t!)
When not fasting
Our readers here are probably not “the average person” (since we have a very health-conscious subscriber-base), but the average person in N. America consumes about 9g of salt per day, which is several multiples of the maximum recommended safe amount.
The WHO recommends no more than 5g of salt per day (roughly 2g of sodium), while the AHA recommends no more than 2.3g of sodium per day, and says we should ideally aim for 1.5g (this is, you’ll note, consistent with the previous “1200–1500mg range”).
Read more: Massive efforts needed to reduce salt intake and protect lives
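For anyone wanting to sanity-check the figures above, the key is that table salt (NaCl) is only about 39% sodium by mass, so “grams of salt” and “grams of sodium” are not interchangeable. Here’s a quick back-of-envelope sketch (ours, not from any of the sources cited; the conversion factor is an approximation from molar masses):

```python
# Sodium's share of table salt by molar mass: Na (22.99) / NaCl (58.44) ≈ 0.393
SODIUM_FRACTION_OF_SALT = 22.99 / 58.44

def salt_to_sodium_mg(salt_g: float) -> float:
    """Convert grams of table salt to milligrams of sodium."""
    return salt_g * 1000 * SODIUM_FRACTION_OF_SALT

def sodium_to_salt_g(sodium_mg: float) -> float:
    """Convert milligrams of sodium to grams of table salt."""
    return sodium_mg / 1000 / SODIUM_FRACTION_OF_SALT

# The WHO's 5g-of-salt limit works out to roughly 2g of sodium:
print(salt_to_sodium_mg(5))      # just under 2000 mg

# The AHA's ideal 1.5g of sodium is under 4g of salt:
print(sodium_to_salt_g(1500))    # a bit under 4 g

# The ~9g/day average intake, as a multiple of the WHO's 5g limit:
print(9 / 5)                     # 1.8x
```

In other words, the average North American intake mentioned below overshoots the strictest (AHA ideal) target by a factor of more than two even before seasoning anything at the table.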
Questionable claims
We can’t speak for LMNT (and indeed, had to look them up to discover they are an electrolytes supplement brand), but we can say that sometimes there are articles about such things as “The doctor who says we should eat more salt, not less”, and that’s usually about Dr. James DiNicolantonio, a doctor of pharmacy, who wrote a book that, because of this question today, we’ve now also reviewed:
Spoiler: our review was not favorable.
The body knows
Our kidneys (unless they are diseased or missing) do a full-time job of getting rid of excess things from our blood, and dumping them into one’s urine.
That includes excess sugar (which is how diabetes was originally diagnosed) and excess salt. In both cases, they can only process so much, but they do their best.
Dr. DiNicolantonio recognizes this in his book, but chalks it up to “if we do take too much salt, we’ll just pass it in urine, so no big deal”.
Unfortunately, this assumes that our kidneys have infinite operating capacity. They’re good, but they’re not that good: they can only filter about 1 liter of fluid per hour. Remember that we have about 5 liters of blood, consume 2–3 liters of water per day, and, depending on our diet, several more liters of water in food (easy to do if one eats fruit, let alone soups and stews). And when things arrive in our body, the body gets to work on them right away, because it doesn’t know how much time it will have before the next intake comes.
It is reasonable to believe that if we really needed 8–10g of salt per day, as Dr. DiNicolantonio claims, our kidneys would not start dumping salt once our blood reaches much, much lower levels (lower even than the daily recommended intake, because not all of the salt in our body is in our blood, obviously).
See also: How Too Much Salt Can Lead To Organ Failure
Lastly, a note about high blood pressure
This is one where the “salt’s not the bad guy” crowd have at least something close to a point, because while salt is indeed still a bad guy (if taken above the recommended amounts, without good medical reason), when it comes to high blood pressure specifically, it’s not the worst bad guy, nor is it even in the top 5:
Hypertension: Factors Far More Relevant Than Salt
Thanks for writing in with such an interesting question!
What Are Nootropics, Really?
What are nootropics, really?
A nootropic is anything that functions as a cognitive enhancer—in other words, improves our brainpower.
These can be sensationalized as “smart drugs”, misrepresented excitingly in science fiction, meme-ified in the mundane (“but first, coffee”), and reframed entirely (“exercise is the best nootropic”).
So, clearly, “nootropics” can mean a lot of different things. Let’s look at some of the main categories…
The neurochemical modulators
These are what often get called “smart drugs”. They are literally drugs (have a chemical effect on the body that isn’t found in our diet), and they affect the levels of certain neurotransmitters in the brain, such as by:
- Adding more of that neurotransmitter (simple enough)
- Decreasing the rate at which we lose that neurotransmitter (re-uptake inhibitors)
- Antagonizing an unhelpful neurotransmitter (doing the opposite thing to it)
- Blocking an unhelpful neurotransmitter (stopping the receptors from receiving it)
“Unhelpful” here is relative and subjective, of course. We need all the neurotransmitters that are in our brain, after all; we just don’t need all of them all the time.
Examples: modafinil, a dopamine re-uptake inhibitor (mostly prescribed for sleep disorders), reduces the rate at which our brains scrub dopamine, resulting in a gradual build-up of the dopamine we naturally produced, so we get to enjoy that dopamine for longer. This will tend to promote wakefulness, and may also help with problem-solving and language faculties, as well as giving a mood boost: all things that dopamine is used for. Mirtazapine, an α2 adrenoreceptor antagonist (mostly prescribed as an antidepressant), blocks the autoreceptors that normally rein in noradrenaline release, thus increasing noradrenergic neurotransmission and giving many other brain functions a boost.
Why it works: our brains need healthy levels of neurotransmitters, in order to function well. Those levels are normally self-regulating, but can become depleted in times of stress or fatigue, for example.
The metabolic brain boosters
These are the kind of things that get included in nootropic stacks (stack = a collection of supplements and/or drugs that complement each other and are taken together—for example, a multivitamin tablet could be described as a vitamin stack) even though they have nothing specifically relating them to brain function. Why are they included?
The brain needs so much fuel. Metabolically speaking, it’s a gas-guzzler. It’s the single most resource-intensive organ of our body, by far. So, metabolic brain boosters tend to:
- Increase blood flow
- Increase blood oxygenation
- Improve general blood health
- Improve blood pressure (this is relative and subjective, since very obviously there’s a sweet spot)
Examples: B-vitamins. Yep, it can be that simple. A less obvious example might be Co-enzyme Q10, which supports energy production on a cellular level, and good cardiovascular health.
Why it works: you can’t have a healthy brain without a healthy heart!
We are such stuff as brains are made of
Our brains are made of mostly fat, water, and protein. But, not just any old fat and protein—we’re at least a little bit special! So, brain-food foods tend to:
- Give the brain the fats and proteins it’s made of
- Give the brain the stuff to make the fats and proteins it’s made of (simpler fats, and amino acids)
- Give the brain hydration! Just having water, and electrolytes as appropriate, does this
Examples: healthy fats from nuts, seeds, and seafood; also, a lot of phytonutrients from greens and certain fruits. Long-time subscribers may remember our article “Brain Food: The Eyes Have It!” on the importance of dietary lutein in reducing Alzheimer’s risk, for example.
Why it works: this is a matter of structural upkeep and maintenance—our brains don’t work fabulously if deprived of the very stuff they’re made of! Hydration, especially, is a seriously underrated nootropic factor, by the way. Most people are dehydrated most of the time, and the brain dehydrates quickly. Fortunately, it also rehydrates quickly when we take hydrating liquids.
Weird things that sound like ingredients in a witch’s potion
These are too numerous and too varied in how they work to cover here, but they do appear a lot in nootropic stacks and in popular literature on the subject.
Often they work by one of the mechanisms described above; sometimes we’re not entirely sure how they work, and have only measured their effects sufficiently to know that, somehow, they do work.
Examples: panax ginseng is one of the best-studied examples that still remains quite mysterious in many aspects of its mechanism. Lion’s Mane (the mushroom, not the jellyfish or the big cat hairstyle), meanwhile, is known to contain specific compounds that stimulate healthy brain cell growth.
Why it works: as we say, it varies so much from one ingredient to another in this category, so… Watch out for our Research Review Monday features, as we’ll be covering some of these in the coming weeks!
(PS, if there’s any you’d like us to focus on, let us know! We always love to hear from you. You can hit reply to any of our emails, or use the handy feedback widget at the bottom)
Pistachios vs Pine Nuts – Which is Healthier?
Our Verdict
When comparing pistachios to pine nuts, we picked the pistachios.
Why?
First looking at the macros, pistachios have nearly 2x the protein while pine nuts have nearly 2x the fat. The fats are healthy in moderation (mostly polyunsaturated, a fair portion of monounsaturated, and a little saturated), but we’re going to value the protein content higher. Also, pistachios have approximately 2x the carbs, and/but nearly 3x the fiber. All in all, we’ll call this section a moderate win for pistachios.
When it comes to vitamins, pistachios have more of vitamins A, B1, B5, B6, B9, and C, while pine nuts have more of vitamins B2, B3, E, K, and choline. All in all, pistachios are scraping a 6:5 win here, or we could call it a tie if we want to value pine nuts’ vitamins more (due to the difference in how many foods each vitamin is found in, and thus the likelihood of having a deficiency or not).
In the category of minerals, pistachios have more calcium, copper, potassium, and selenium, while pine nuts have more iron, magnesium, manganese, and zinc. This would be a tie if we just call it 4:4, but what’s worth noting is that while both of these nuts are a good source of most of the minerals mentioned, pine nuts aren’t a very good source of calcium or selenium, so we’re going to declare this section a very marginal win for pistachios.
The moderate win, the scraped win, and the barely scraped win all add up to an overall win for pistachios. However, as you might have noticed, both are great, so do enjoy both if you can!
Want to learn more?
You might like to read:
Why You Should Diversify Your Nuts
Take care!