Walden Farms Caesar Dressing vs. Primal Kitchen Caesar Dressing – Which is Healthier?
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Our Verdict
When comparing Walden Farms Caesar Dressing to Primal Kitchen Caesar Dressing, we picked the Primal Kitchen.
Why?
As you can see from the front label, the Walden Farms product has 0 net carbs, 0 calories, and 0 fat. In fact, its ingredients list begins:
Water, white distilled vinegar, erythritol, corn fiber, salt, microcrystalline cellulose, xanthan gum, titanium dioxide (color)
…before it gets to something interesting (garlic purée), by which point the amount must be minuscule.
The Primal Kitchen product, meanwhile, has 140 calories per serving and 15g fat (of which, 1.5g is saturated). However! The ingredients list this time begins:
Avocado oil, water, organic coconut aminos (organic coconut sap, sea salt), organic apple cider vinegar, organic distilled vinegar, mushroom extract, organic gum acacia, organic guar gum
…before it too gets to garlic, which this time, by the way, is organic roasted garlic.
In case you’re wondering about the salt content, it comes to 190mg for the Walden Farms product, and 240mg for the Primal Kitchen product. We don’t think that the extra 50mg (out of a daily allowance of 2,300–5,000mg, depending on whom you ask) is worthy of note.
In short, the Walden Farms product is made of mostly additives of various kinds, whereas the Primal Kitchen product is made of mostly healthful ingredients.
So, the calories and fat are nothing to fear.
For this reason, we chose the product with more healthful ingredients—but we acknowledge that if you are specifically trying to keep your calories down, then the Walden Farms product may be a valid choice.
Read more:
• Can Saturated Fats Be Healthy?
• Caloric Restriction with Optimal Nutrition
Don’t Forget…
Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!
Recommended
Learn to Age Gracefully
Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:
-
Smarter Tomorrow – by Elizabeth Ricker
Based heavily in hard science, with more than 450 citations across over 300 pages, the author’s exhortation is not just “trust me, lol”.
Instead, she encourages the reader to experiment. Not like “try this and see if it works”, but “here’s how to try this, using scientific method with good controls and good record-keeping”.
The book is divided into sections, each with a projection of time required at the start and a summary at the end. The reading style is easy-reading throughout, without sacrificing substance.
It proposes seven key interventions. If just one works for you, it’ll be worth having bought and read the book. More likely, most if not all of them will, because that’s how science works.
Share This Post
-
Kombucha vs Kimchi – Which is Healthier
Our Verdict
When comparing kombucha to kimchi, we picked the kombucha.
Why?
While both are very respectable gut-healthy fermented products,
• the kombucha contains fermented tea, a little apple cider vinegar, and a little fiber
• the kimchi contains (after the vegetables) 810mg sodium in that little tin, and despite the vegetables, no fiber.

You may reasonably be surprised that they managed to take something that is made of mostly vegetables and end up with no fiber without juicing it, but they did. Fermented vegetables are great for the healthy bacteria benefits (and are tasty too!), but the osmotic pressure due to the salt destroys the cell walls and thus the fiber.
Thus, we chose the kombucha, which does the same job without delivering all that salt.
However! If you are comparing kombucha and kimchi out in the wilds of your local supermarket, do still check individual labels. It’s not uncommon, for example, for stores to sell pre-made kombucha that’s loaded with sugar.
About sugar and kombucha…
Sugar is required to make kombucha, to feed the yeast and helpful bacteria. However, there should be none of that sugar left (or only the tiniest trace amount) in the final product, because the yeast (and friends) consumed and metabolized it.
What some store brands do, however, is add in sugar afterwards, as they believe it improves the taste. This writer cannot imagine how, but that is their rationale in any case. Needless to say, it is not a healthy addition, and specifically, it’s bad for your gut, which (healthwise) is the whole point of drinking kombucha in the first place.
Want some? Here is an example product on Amazon, but feel free to shop around as there are many flavors available!
Read more about gut health: Gut Health 101
-
Parents are increasingly saying their child is ‘dysregulated’. What does that actually mean?
Welcome aboard the roller coaster of parenthood, where emotions run wild, tantrums reign supreme and love flows deep.
As children reach toddlerhood and beyond, parents adapt to manage their child’s big emotions and meltdowns. Parenting terminology has adapted too, with more parents describing their child as “dysregulated”.
But what does this actually mean?
More than an emotion
Emotional dysregulation refers to challenges a child faces in recognising and expressing emotions, and managing emotional reactions in social settings.
This may involve either suppressing emotions or displaying exaggerated and intense emotional responses that get in the way of the child doing what they want or need to do.
“Dysregulation” is more than just feeling an emotion. An emotion is a signal, or cue, that can give us important insights to ourselves and our preferences, desires and goals.
An emotionally dysregulated brain is overwhelmed and overloaded (often, with distressing emotions like frustration, disappointment and fear) and is ready to fight, flight or freeze.
Developing emotional regulation
Emotion regulation is a skill that develops across childhood and is influenced by factors such as the child’s temperament and the emotional environment in which they are raised.
In the stage of emotional development where emotion regulation is a primary goal (around 3–5 years old), children begin exploring their surroundings and asserting their desires more actively.
It’s typical for them to experience emotional dysregulation when their initiatives are thwarted or criticised, leading to occasional tantrums or outbursts.
A typically developing child will see these types of outbursts reduce as their cognitive abilities become more sophisticated, usually around the age they start school.
Express, don’t suppress
Expressing emotions in childhood is crucial for social and emotional development. It involves the ability to convey feelings verbally and through facial expressions and body language.
When children struggle with emotional expression, it can manifest in various ways, such as difficulty in being understood, flat facial expressions even in emotionally charged situations, challenges in forming close relationships, and indecisiveness.
Several factors, including anxiety, attention-deficit hyperactivity disorder (ADHD), autism, giftedness, rigidity and both mild and significant trauma experiences, can contribute to these issues.
A common mistake parents make is dismissing emotions, or distracting children away from how they feel.
These strategies don’t work and increase feelings of overwhelm. In the long term, they fail to equip children with the skills to identify, express and communicate their emotions, making them vulnerable to future emotional difficulties.
We need to help children move compassionately towards their difficulties, rather than away from them. Parents need to do this for themselves too.
Caregiving and skill modelling
Parents are responsible for creating an emotional climate that facilitates the development of emotion regulation skills.
Parents’ own modelling of emotion regulation when they feel distressed, and the way they respond to the expression of emotions in their children, contribute to how children understand and regulate their own emotions.
Children are hardwired to be attuned to their caregivers’ emotions, moods, and coping, as this is integral to their survival. In fact, the biggest threat to a child is their caregiver not being OK.
Unsafe, unpredictable, or chaotic home environments rarely give children exposure to healthy emotion expression and regulation. Children who go through maltreatment have a harder time controlling their emotions, needing more brainpower for tasks that involve managing feelings. This struggle could lead to more problems with emotions later on, like feeling anxious and hypervigilant to potential threats.
Recognising and addressing these challenges early on is essential for supporting children’s emotional wellbeing and development.
A dysregulated brain and body
When kids enter “fight or flight” mode, they often struggle to cope or listen to reason. When children experience acute stress, they may respond instinctively without pausing to consider strategies or logic.
If your child is in fight mode, you might observe behaviours such as crying, clenching fists or jaw, kicking, punching, biting, swearing, spitting or screaming.
In flight mode, they may appear restless, have darting eyes, exhibit excessive fidgeting, breathe rapidly, or try to run away.
A shut-down response may look like fainting or a panic attack.
When a child feels threatened, their brain’s frontal lobe, responsible for rational thinking and problem-solving, essentially goes offline.
This happens when the amygdala, the brain’s alarm system, sends out a false alarm, triggering the survival instinct.
In this state, a child may not be able to access higher functions like reasoning or decision-making.
While our instinct might be to immediately fix the problem, staying present with our child during these moments is more effective. It’s about providing support and understanding until they feel safe enough to engage their higher brain functions again.
Reframe your thinking so you see your child as having a problem – not being the problem.
Tips for parents
Take turns discussing the highs and lows of the day at meal times. This is a chance for you to be curious, acknowledge and label feelings, and model that you, too, experience a range of emotions that require coping skills. This practice has shown evidence of numerous physical, social-emotional, academic and behavioural benefits.
Spending even small amounts (five minutes a day!) of quality one-on-one time with your child is an investment in your child’s emotional wellbeing. Let them pick the activity, do your best to follow their lead, and try to notice and comment on the things they do well, like creative ideas, persevering when things are difficult, and being gentle or kind.
Take a tip from parents of children with neurodiversity: learn about your unique child. Approaching your child’s emotions, temperament, and behaviours with curiosity can help you to help them develop emotion regulation skills.
When to get help
If emotion dysregulation is a persistent issue that is getting in the way of your child feeling happy, calm, or confident – or interfering with learning or important relationships with family members or peers – talk to their GP about engaging with a mental health professional.
Many families have found parenting programs helpful in creating a climate where emotions can be safely expressed and shared.
Remember, you can’t pour from an empty cup. Parenting requires you to be your best self and tend to your needs first to see your child flourish.
Cher McGillivray, Assistant Professor Psychology Department, Bond University and Shawna Mastro Campbell, Assistant Professor Psychology, Bond University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Related Posts
-
Popcorn vs Peanuts – Which is Healthier
Our Verdict
When comparing air-popped popcorn to peanuts (without an allergy), we picked the peanuts.
Why?
Peanuts, if we were to list popular nuts in order of healthfulness, would not be near the top of the list. Many other nuts have more nutrients and fewer/lesser drawbacks.
But the comparison to popcorn shines a different light on it:
Popcorn has very few nutrients. It’s mostly carbs and fiber; it’s just not a lot of carbs, because the manner of its consumption makes it a very light snack (literally). You can eat a bowlful and it’s perhaps 30g. It has some small amounts of some minerals, but nothing you could rely on it for. It’s mostly fresh air wrapped in fiber.
Peanuts, in contrast, are a much denser snack. High in calories yes, but also high in protein, their fats are mostly healthy, and they have not only a fair stock of vitamins and minerals, but also a respectable complement of beneficial phytochemicals: mostly assorted antioxidant polyphenols, but also oleic acid (as in olives, good for healthy triglyceride levels).
Another thing worth a mention is their cholesterol-reducing phytosterols (these reduce the absorption of dietary cholesterol, “good” and “bad”, so this is good for most people, bad for some, depending on the state of your cholesterol and what you ate near in time to eating the nuts).
Peanuts do have their clear downsides too: their phytic acid content can reduce the bioavailability of iron and zinc taken at the same time.
In summary: while popcorn’s greatest claim to dietary beneficence is its fiber content and that it’s close to being a “zero snack”, peanuts (eaten in moderation, say, the same 30g as the popcorn) have a lot to contribute to our daily nutritional requirements.
We do suggest enjoying other nuts though!
Read more: Why You Should Diversify Your Nuts!
-
How do science journalists decide whether a psychology study is worth covering?
Complex research papers and data flood academic journals daily, and science journalists play a pivotal role in disseminating that information to the public. This can be a daunting task, requiring a keen understanding of the subject matter and the ability to translate dense academic language into narratives that resonate with the general public.
Several resources and tip sheets, including the Know Your Research section here at The Journalist’s Resource, aim to help journalists hone their skills in reporting on academic research.
But what factors do science journalists look for to decide whether a social science research study is trustworthy and newsworthy? That’s the question researchers at the University of California, Davis, and the University of Melbourne in Australia examine in a recent study, “How Do Science Journalists Evaluate Psychology Research?” published in September in Advances in Methods and Practices in Psychological Science.
Their online survey of 181 mostly U.S.-based science journalists looked at how and whether they were influenced by four factors in fictitious research summaries: the sample size (number of participants in the study), sample representativeness (whether the participants in the study were from a convenience sample or a more representative sample), the statistical significance level of the result (just barely statistically significant or well below the significance threshold), and the prestige of a researcher’s university.
The researchers found that sample size was the only factor that had a robust influence on journalists’ ratings of how trustworthy and newsworthy a study finding was.
University prestige had no effect, while the effects of sample representativeness and statistical significance were inconclusive.
But there’s nuance to the findings, the authors note.
“I don’t want people to think that science journalists aren’t paying attention to other things, and are only paying attention to sample size,” says Julia Bottesini, an independent researcher, a recent Ph.D. graduate from the Psychology Department at UC Davis, and the first author of the study.
Overall, the results show that “these journalists are doing a very decent job” vetting research findings, Bottesini says.
Also, the findings from the study are not generalizable to all science journalists or other fields of research, the authors note.
“Instead, our conclusions should be circumscribed to U.S.-based science journalists who are at least somewhat familiar with the statistical and replication challenges facing science,” they write. (Over the past decade a series of projects have found that the results of many studies in psychology and other fields can’t be reproduced, leading to what has been called a ‘replication crisis.’)
“This [study] is just one tiny brick in the wall and I hope other people get excited about this topic and do more research on it,” Bottesini says.
More on the study’s findings
The study’s findings can be useful for researchers who want to better understand how science journalists read their research and what kind of intervention — such as teaching journalists about statistics — can help journalists better understand research papers.
“As an academic, I take away the idea that journalists are a great population to try to study because they’re doing something really important and it’s important to know more about what they’re doing,” says Ellen Peters, director of Center for Science Communication Research at the School of Journalism and Communication at the University of Oregon. Peters, who was not involved in the study, is also a psychologist who studies human judgment and decision-making.
Peters says the study was “overall terrific.” She adds that understanding how journalists do their work “is an incredibly important thing to do because journalists are who reach the majority of the U.S. with science news, so understanding how they’re reading some of our scientific studies and then choosing whether to write about them or not is important.”
The study, conducted between December 2020 and March 2021, is based on an online survey of journalists who said they at least sometimes covered science or other topics related to health, medicine, psychology, social sciences, or well-being. They were offered a $25 Amazon gift card as compensation.
Among the participants, 77% were women, 19% were men, 3% were nonbinary and 1% preferred not to say. About 62% said they had studied physical or natural sciences at the undergraduate level, and 24% at the graduate level. Also, 48% reported having a journalism degree. The study did not include the journalists’ news reporting experience level.
Participants were recruited through the professional network of Christie Aschwanden, an independent journalist and consultant on the study, which could be a source of bias, the authors note.
“Although the size of the sample we obtained (N = 181) suggests we were able to collect a range of perspectives, we suspect this sample is biased by an ‘Aschwanden effect’: that science journalists in the same professional network as C. Aschwanden will be more familiar with issues related to the replication crisis in psychology and subsequent methodological reform, a topic C. Aschwanden has covered extensively in her work,” they write.
Participants were randomly presented with eight of 22 one-paragraph fictitious social and personality psychology research summaries with fictitious authors. The summaries are posted on Open Science Framework, a free and open-source project management tool for researchers by the Center for Open Science, with a mission to increase openness, integrity and reproducibility of research.
For instance, one of the vignettes reads:
“Scientists at Harvard University announced today the results of a study exploring whether introspection can improve cooperation. 550 undergraduates at the university were randomly assigned to either do a breathing exercise or reflect on a series of questions designed to promote introspective thoughts for 5 minutes. Participants then engaged in a cooperative decision-making game, where cooperation resulted in better outcomes. People who spent time on introspection performed significantly better at these cooperative games (t (548) = 3.21, p = 0.001). ‘Introspection seems to promote better cooperation between people,’ says Dr. Quinn, the lead author on the paper.”
In addition to answering multiple-choice survey questions, participants were given the opportunity to answer open-ended questions, such as “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?”
Bottesini says those responses illuminated how science journalists analyze a research study. Participants often mentioned the prestige of the journal in which it was published or whether the study had been peer-reviewed. Many also seemed to value experimental research designs over observational studies.
Considering statistical significance
When it came to considering p-values, “some answers suggested that journalists do take statistical significance into account, but only very few included explanations that suggested they made any distinction between higher or lower p values; instead, most mentions of p values suggest journalists focused on whether the key result was statistically significant,” the authors write.
Also, many participants mentioned that it was very important to talk to outside experts or researchers in the same field to get a better understanding of the finding and whether it could be trusted, the authors write.
“Journalists also expressed that it was important to understand who funded the study and whether the researchers or funders had any conflicts of interest,” they write.
Participants also “indicated that making claims that were calibrated to the evidence was also important and expressed misgivings about studies for which the conclusions do not follow from the evidence,” the authors write.
In response to the open-ended question, “What characteristics do you [typically] consider when evaluating the trustworthiness of a scientific finding?” some journalists wrote they checked whether the study was overstating conclusions or claims. Below are some of their written responses:
- “Is the researcher adamant that this study of 40 college kids is representative? If so, that’s a red flag.”
- “Whether authors make sweeping generalizations based on the study or take a more measured approach to sharing and promoting it.”
- “Another major point for me is how ‘certain’ the scientists appear to be when commenting on their findings. If a researcher makes claims which I consider to be over-the-top about the validity or impact of their findings, I often won’t cover.”
- “I also look at the difference between what an experiment actually shows versus the conclusion researchers draw from it — if there’s a big gap, that’s a huge red flag.”
Peters says the study’s findings show that “not only are journalists smart, but they have also gone out of their way to get educated about things that should matter.”
What other research shows about science journalists
A 2023 study, published in the International Journal of Communication, based on an online survey of 82 U.S. science journalists, aims to understand what they know and think about open-access research, including peer-reviewed journals and articles that don’t have a paywall, and preprints. Data was collected between October 2021 and February 2022. Preprints are scientific studies that have yet to be peer-reviewed and are shared on open repositories such as medRxiv and bioRxiv. The study finds that its respondents “are aware of OA and related issues and make conscious decisions around which OA scholarly articles they use as sources.”
A 2021 study, published in the Journal of Science Communication, looks at the impact of the COVID-19 pandemic on the work of science journalists. Based on an online survey of 633 science journalists from 77 countries, it finds that the pandemic somewhat brought scientists and science journalists closer together. “For most respondents, scientists were more available and more talkative,” the authors write. The pandemic has also provided an opportunity to explain the scientific process to the public, and remind them that “science is not a finished enterprise,” the authors write.
More than a decade ago, a 2008 study, published in PLOS Medicine, and based on an analysis of 500 health news stories, found that “journalists usually fail to discuss costs, the quality of the evidence, the existence of alternative options, and the absolute magnitude of potential benefits and harms,” when reporting on research studies. Giving time to journalists to research and understand the studies, giving them space for publication and broadcasting of the stories, and training them in understanding academic research are some of the solutions to fill the gaps, writes Gary Schwitzer, the study author.
Advice for journalists
We asked Bottesini, Peters, Aschwanden and Tamar Wilner, a postdoctoral fellow at the University of Texas, who was not involved in the study, to share advice for journalists who cover research studies. Wilner is conducting a study on how journalism research informs the practice of journalism. Here are their tips:
1. Examine the study before reporting it.
Does the study claim match the evidence? “One thing that makes me trust the paper more is if their interpretation of the findings is very calibrated to the kind of evidence that they have,” says Bottesini. In other words, if the study makes a claim in its results that’s far-fetched, the authors should present a lot of evidence to back that claim.
Not all surprising results are newsworthy. If you come across a surprising finding from a single study, Peters advises you to step back and remember Carl Sagan’s quote: “Extraordinary claims require extraordinary evidence.”
How transparent are the authors about their data? For instance, are the authors posting information such as their data and the computer codes they use to analyze the data on platforms such as Open Science Framework, AsPredicted, or The Dataverse Project? Some researchers ‘preregister’ their studies, which means they share how they’re planning to analyze the data before they see them. “Transparency doesn’t automatically mean that a study is trustworthy,” but it gives others the chance to double-check the findings, Bottesini says.
Look at the study design. Is it an experimental study or an observational study? Observational studies can show correlations but not causation.
“Observational studies can be very important for suggesting hypotheses and pointing us towards relationships and associations,” Aschwanden says.
Experimental studies can provide stronger evidence toward a cause, but journalists must still be cautious when reporting the results, she advises. “If we end up implying causality, then once it’s published and people see it, it can really take hold,” she says.
Know the difference between preprints and peer-reviewed, published studies. Peer-reviewed papers tend to be of higher quality than those that are not peer-reviewed. Read our tip sheet on the difference between preprints and journal articles.
Beware of predatory journals. Predatory journals are journals that “claim to be legitimate scholarly journals, but misrepresent their publishing practices,” according to a 2020 journal article, published in the journal Toxicologic Pathology, “Predatory Journals: What They Are and How to Avoid Them.”
2. Zoom in on data.
Read the methods section of the study. The methods section of the study usually appears after the introduction and background section. “To me, the methods section is almost the most important part of any scientific paper,” says Aschwanden. “It’s amazing to me how often you read the design and the methods section, and anyone can see that it’s a flawed design. So just giving things a gut-level check can be really important.”
What’s the sample size? Not all good studies have large numbers of participants but pay attention to the claims a study makes with a small sample size. “If you have a small sample, you calibrate your claims to the things you can tell about those people and don’t make big claims based on a little bit of evidence,” says Bottesini.
But also remember that factors such as sample size and p-value are not “as clear cut as some journalists might assume,” says Wilner.
How representative of a population is the study sample? “If the study has a non-representative sample of, say, undergraduate students, and they’re making claims about the general population, that’s kind of a red flag,” says Bottesini. Aschwanden points to the acronym WEIRD, which stands for “Western, Educated, Industrialized, Rich, and Democratic,” and is used to highlight a lack of diversity in a sample. Studies based on such samples may not be generalizable to the entire population, she says.
Look at the p-value. Statistical significance is both confusing and controversial, but it’s important to consider. Read our tip sheet, “5 Things Journalists Need to Know About Statistical Significance,” to better understand it.
3. Talk to scientists not involved in the study.
If you’re not sure about the quality of a study, ask for help. “Talk to someone who is an expert in study design or statistics to make sure that [the study authors] use the appropriate statistics and that methods they use are appropriate because it’s amazing to me how often they’re not,” says Aschwanden.
Get an opinion from an outside expert. It’s always a good idea to present the study to other researchers in the field, who have no conflicts of interest and are not involved in the research you’re covering and get their opinion. “Don’t take scientists at their word. Look into it. Ask other scientists, preferably the ones who don’t have a conflict of interest with the research,” says Bottesini.
4. Remember that a single study is simply one piece of a growing body of evidence.
“I have a general rule that a single study doesn’t tell us very much; it just gives us proof of concept,” says Peters. “It gives us interesting ideas. It should be retested. We need an accumulation of evidence.”
Aschwanden says as a practice, she tries to avoid reporting stories about individual studies, with some exceptions such as very large, randomized controlled studies that have been underway for a long time and have a large number of participants. “I don’t want to say you never want to write a single-study story, but it always needs to be placed in the context of the rest of the evidence that we have available,” she says.
Wilner advises journalists to spend some time looking at the scope of research on the study’s specific topic and learn how it has been written about and studied up to that point.
“We would want science journalists to be reporting balance of evidence, and not focusing unduly on the findings that are just in front of them in a most recent study,” Wilner says. “And that’s a very difficult thing to ask journalists to do, because they’re being asked to make their article very newsy, so it’s a difficult balancing act, but we can try and push journalists to do more of that.”
5. Remind readers that science is always changing.
“Science is always two steps forward, one step back,” says Peters. Give the public a notion of uncertainty, she advises. “This is what we know today. It may change tomorrow, but this is the best science that we know of today.”
Aschwanden echoes the sentiment. “All scientific results are provisional, and we need to keep that in mind,” she says. “It doesn’t mean that we can’t know anything, but it’s very important that we don’t overstate things.”
Authors of a study published in PNAS in January analyzed more than 14,000 psychology papers and found that replication success rates differ widely by psychology subfields. That study also found that papers that could not be replicated received more initial press coverage than those that could.
The authors note that the media “plays a significant role in creating the public’s image of science and democratizing knowledge, but it is often incentivized to report on counterintuitive and eye-catching results.”
Ideally, the news media would have a positive relationship with replication success rates in psychology, the authors of the PNAS study write. “Contrary to this ideal, however, we found a negative association between media coverage of a paper and the paper’s likelihood of replication success,” they write. “Therefore, deciding a paper’s merit based on its media coverage is unwise. It would be valuable for the media to remind the audience that new and novel scientific results are only food for thought before future replication confirms their robustness.”
Additional reading
- Uncovering the Research Behaviors of Reporters: A Conceptual Framework for Information Literacy in Journalism — Katerine E. Boss, et al. Journalism & Mass Communication Educator, October 2022.
- The Problem with Psychological Research in the Media — Steven Stosny. Psychology Today, September 2022.
- Critically Evaluating Claims — Megha Satyanarayana. The Open Notebook, January 2022.
- How Should Journalists Report a Scientific Study? — Charles Binkley and Subramaniam Vincent. Markkula Center for Applied Ethics at Santa Clara University, September 2020.
- What Journalists Get Wrong About Social Science: Full Responses — Brian Resnick. Vox, January 2016.
From The Journalist’s Resource
8 Ways Journalists Can Access Academic Research for Free
5 Things Journalists Need to Know About Statistical Significance
5 Common Research Designs: A Quick Primer for Journalists
5 Tips for Using PubPeer to Investigate Scientific Research Errors and Misconduct
What’s Standard Deviation? 4 Things Journalists Need to Know
This article first appeared on The Journalist’s Resource and is republished here under a Creative Commons license.
Don’t Forget…
Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!
Learn to Age Gracefully
Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:
Superfood Pesto Pizza
Not only is this pizza full of foods that punch above their weight healthwise, but there’s also no kneading and no waiting when it comes to the base. Homemade pizzas made easy!
You will need
For the topping:
- 1 zucchini, sliced
- 1 red bell pepper, cut into strips
- 3 oz mushrooms, sliced
- 3 shallots, cut into quarters
- 6 sun-dried tomatoes, roughly chopped
- ½ bulb garlic (papery skins removed, but cloves left intact, unless they are very large, in which case halve them)
- 1 oz pitted black olives, halved
- 1 handful arugula
- 1 tbsp extra virgin olive oil
- 2 tsp black pepper, coarse ground
- ½ tsp MSG or 1 tsp low-sodium salt
For the base:
- ½ cup chickpea flour (also called besan or gram flour)
- 2 tsp extra virgin olive oil
- ½ tsp baking powder
- ⅛ tsp MSG or ¼ tsp low-sodium salt
For the pesto sauce:
- 1 large bunch basil, chopped
- ½ avocado, pitted and peeled
- 1 oz pine nuts
- ¼ bulb garlic, crushed
- 2 tbsp nutritional yeast
- 1 tsp black pepper
- Juice of ½ lemon
Method
(we suggest you read everything at least once before doing anything)
1) Preheat the oven to 400℉ / 200℃.
2) Toss the zucchini, bell pepper, mushrooms, shallots, and garlic cloves in 1 tbsp olive oil, ensuring an even coating. Season with the black pepper and MSG/salt, and put on a baking tray lined with baking paper, to roast for about 20 minutes, until they are slightly charred.
3) While the vegetables are in the oven, make the pizza base by combining the dry ingredients in a bowl, making a well in the middle, adding the olive oil and whisking it in, and then slowly (i.e., a little bit at a time) whisking in 1 cup cold water. This should take under 5 minutes.
4) Don’t panic when this doesn’t become a dough; it is supposed to be a thick batter, so that’s fine. Pour it into a 9″ pizza pan, and bake for about 15 minutes, until firm. Rotate it if necessary partway through; whether it needs this or not will depend on your oven.
5) While the pizza base is in the oven, make the pesto sauce by blending all the pesto sauce ingredients in a high-speed blender until smooth.
6) When the base and vegetables are ready (these should be finished around the same time), spread the pesto sauce on the base, scatter the arugula over it followed by the vegetables and then the olives and sun-dried tomatoes.
7) Serve, adding any garnish or other final touches that take your fancy.
Enjoy!
Want to learn more?
For those interested in some of the science of what we have going on today:
- Which Bell Peppers To Pick? A Spectrum Of Specialties
- Ergothioneine In Mushrooms: “The Longevity Vitamin” (That’s Not A Vitamin)
- Black Olives vs Green Olives – Which is Healthier?
- Lycopene’s Benefits For The Gut, Heart, Brain, & More
- Coconut vs Avocado – Which is Healthier?
- Herbs for Evidence-Based Health & Healing
- Spermidine For Longevity
Take care!