Lime-Charred Cauliflower Popcorn
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Called “popcorn” for its appearance and tasty-snackness, this one otherwise bears little relation to the usual movie theater snack, and it’s both tastier and healthier. All that said, it can be eaten on its own as a snack (even with a movie, if you so wish), or served as one part of a many-dish banquet, or (this writer’s favorite) as a delicious appetizer that also puts down a healthy bed of fiber ready for the main course to follow it.
You will need
- 1 cauliflower, cut into small (popcorn-sized) florets
- 2 tbsp extra virgin olive oil
- 1 tbsp lime pickle
- 1 tsp cumin seeds
- 1 tsp smoked paprika
- 1 tsp chili flakes
- 1 tsp black pepper, coarse ground
- ½ tsp ground turmeric
Method
(we suggest you read everything at least once before doing anything)
1) Preheat your oven as hot as it will go
2) In a small bowl, mix all the ingredients except the cauliflower, to form a marinade
3) Drizzle the marinade over the cauliflower in a larger bowl (i.e. big enough for the cauliflower), and mix well until the cauliflower is entirely, or at least almost entirely, coated. Yes, it’s not a lot of marinade but unless you picked a truly huge cauliflower, the proportions we gave will be enough, and you want the end result to be crisp, not dripping.
4) Spread the marinated cauliflower florets out on a baking tray lined with baking paper. Put it in the oven on the middle shelf, so it doesn’t cook unevenly, but keeping the temperature as high as it goes.
5) When it is charred and crispy golden, it’s done—this should take about 20 minutes, but we’ll say ±5 minutes depending on your oven, so do check on it periodically—and time to serve (it is best enjoyed warm).
Enjoy!
Want to learn more?
For those interested in some of the science of what we have going on today:
- We must do a main feature on the merits of cruciferous vegetables! Watch this space.
- All About Olive Oils (Extra Virgin & Otherwise)
- Capsaicin For Weight Loss And Against Inflammation
- Black Pepper’s Impressive Anti-Cancer Arsenal (And More)
- Why Curcumin (Turmeric) Is Worth Its Weight In Gold
Take care!
Don’t Forget…
Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!
Recommended
Learn to Age Gracefully
Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:
Women Rowing North – by Dr. Mary Pipher
Ageism is rife, as is misogyny. And those can be internalized too, and compounded as they intersect.
Clinical psychologist Dr. Mary Pipher, herself 75, writes for us a guidebook of, as the subtitle goes, “navigating life’s currents and flourishing as we age”.
The book does assume, by the way, that the reader is…
- a woman, and
- getting old (if not already old)
However, the lessons the book imparts are vital for women of any age, and valuable as a matter of insight and perspective for any reader.
Dr. Pipher takes us on a tour of aging as a woman, and what parts of it we can make our own, do things our way, and take what joy we can from it.
The book is not given to “toxic positivity”, though—it also deals with themes of hardship, frustration, and loss.
When it comes to those elements, the book is… honest, human, and raw. But also, an exhortation to hope, beauty, and a carpe diem attitude.
Bottom line: this book is highly recommendable to anyone of any age; life is precious and can be short. And be we blessed with many long years, this book serves as a guide to making each one of them count.
Click here to check out Women Rowing North—it really is worth it
We created a VR tool to test brain function. It could one day help diagnose dementia
If you or a loved one have noticed changes in your memory or thinking as you’ve grown older, this could reflect typical changes that occur with ageing. In some cases though, it might suggest something more, such as the onset of dementia.
The best thing to do if you have concerns is to make an appointment with your GP, who will probably run some tests. Assessment is important because if there is something more going on, early diagnosis can enable prompt access to the right interventions, supports and care.
But current methods of dementia screening have limitations, and testing can be daunting for patients.
Our research suggests virtual reality (VR) could be a useful cognitive screening tool, and mitigate some of the challenges associated with current testing methods, opening up the possibility it may one day play a role in dementia diagnosis.
Where current testing is falling short
If someone is worried about their memory and thinking, their GP might ask them to complete a series of quick tasks that check things like the ability to follow simple instructions, basic arithmetic, memory and orientation.
These sorts of screening tools are really good at confirming cognitive problems that may already be very apparent. But commonly used screening tests are not always so good at detecting early and more subtle difficulties with memory and thinking, meaning such changes could be missed until they get worse.
A clinical neuropsychological assessment is better equipped to detect early changes. This involves a comprehensive review of a patient’s personal and medical history, and detailed assessment of cognitive functions, including attention, language, memory, executive functioning, mood factors and more. However, this can be costly and the testing can take several hours.
Testing is also somewhat removed from everyday experience, not directly tapping into activities of daily living.
Enter virtual reality
VR technology uses computer-generated environments to create immersive experiences that feel like real life. While VR is often used for entertainment, it has increasingly found applications in health care, including in rehabilitation and falls prevention.
Using VR for cognitive screening is still a new area. VR-based cognitive tests generally create a scenario such as shopping at a supermarket or driving around a city to ascertain how a person would perform in these situations.
Notably, they engage various senses and cognitive processes such as sight, sound and spatial awareness in immersive ways. All this may reveal subtle impairments which can be missed by standard methods.
VR assessments are also often more engaging and enjoyable, potentially reducing anxiety for those who may feel uneasy in traditional testing environments, and improving compliance compared to standard assessments.
Millions of people around the world have dementia.
Most studies of VR-based cognitive tests have explored their capacity to pick up impairments in spatial memory (the ability to remember where something is located and how to get there), and the results have been promising.
Given VR’s potential for assisting with diagnosis of cognitive impairment and dementia remains largely untapped, our team developed an online computerised game (referred to as semi-immersive VR) to see how well a person can remember, recall and complete everyday tasks. In our VR game, which lasts about 20 minutes, the user role plays a waiter in a cafe and receives a score on their performance.
To assess its potential, we enlisted more than 140 people to play the game and provide feedback. The results of this research are published across three recent papers.
Testing our VR tool
In our most recently published study, we wanted to verify the accuracy and sensitivity of our VR game to assess cognitive abilities.
We compared our test to an existing screening tool (called the TICS-M) in more than 130 adults. We found our VR task was able to capture meaningful aspects of cognitive function, including recalling food items and spatial memory.
We also found younger adults performed better in the game than older adults, which echoes the pattern commonly seen in regular memory tests.
Adults of a range of ages tried our computerised game.
In a separate study, we followed ten adults aged over 65 while they completed the game, and interviewed them afterwards. We wanted to understand how this group – who the tool would target – perceived the task.
These seniors told us they found the game user-friendly and believed it was a promising tool for screening memory. They described the game as engaging and immersive, expressing enthusiasm to continue playing. They didn’t find the task created anxiety.
For a third study, we spoke to seven health-care professionals about the tool. Overall they gave positive feedback, and noted its dynamic approach to age-old diagnostic challenges.
However, they did flag some concerns and potential barriers to implementing this sort of tool. These included resource constraints in clinical practice (such as time and space to carry out the assessment) and whether it would be accessible for people with limited technological skills. There was also some scepticism about whether the tool would be an accurate method to assist with dementia diagnosis.
While our initial research suggests this tool could be a promising way to assess cognitive performance, this is not the same as diagnosing dementia. To improve the test’s ability to accurately detect those who likely have dementia, we’ll need to make it more specific for that purpose, and carry out further research to validate its effectiveness.
We’ll be conducting more testing of the game soon. Anyone interested in giving it a go to help with our research can register on our team’s website.
Joyce Siette, Research Theme Fellow in Health and Wellbeing, Western Sydney University and Paul Strutt, Senior Lecturer in Psychology, Western Sydney University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The Brain As A Work-In-Progress
And The Brain Goes Marching On!
In Tuesday’s newsletter, we asked you “when does the human brain stop developing?” and got the above-depicted, below-described, set of responses:
- About 64% of people said “Never”
- About 16% of people said “25 years”
- About 9% of people said “65 years”
- About 5% of people said “13 years”
- About 3% of people said “18 years”
- About 3% of people said “45 years”
Some thoughts, before we get into the science:
An alternative wording for the original question was “when does the human brain finish developing”; the meaning is the same but the feeling is slightly different:
- “When does the human brain stop developing?” focuses attention on the idea of cessation, and will skew responses to later ages
- “When does the human brain finish developing?” focuses attention on a kind of “is it done yet?” and will skew responses to earlier ages
Ultimately, since we had to choose one word or another, we picked the shortest one, but it would have been interesting if we could have done an A/B test, and asked half one way, and half the other way!
Why we picked those ages
We picked those ages as poll options for reasons people might be drawn to them:
- 13 years: in English-speaking cultures, an important milestone of entering adolescence (note that the concept of a “teenager” is not precisely universal as most languages do not have “-teen” numbers in the same way; the concept of “adolescent” may thus be tied to other milestones)
- 18 years: age of legal majority in N. America and many other places
- 25 years: age popularly believed to be when the brain is finished developing, due to a study that we’ll talk about shortly (we guess that’s why there’s a spike in our results for this, too!)
- 45 years: age where many midlife hormonal changes occur, and many professionals are considered to have peaked in competence and start looking towards retirement
- 65 years: age considered “senior” in much of N. America and many other places, as well as the cut-off and/or starting point for a lot of medical research
Notice, therefore, how a lot of these numbers come from places they really shouldn’t. For example, because there are many studies saying “n% of people over 65 get Alzheimer’s” or “n% of people over 65 get age-related cognitive decline”, etc., 65 becomes the age where we start expecting this—because of an arbitrary human choice of where to draw the cut-off for study enrollment!
Similarly, we may look at common ages of legal majority, or retirement pensions, and assume “well it must be for a good reason”, and dear reader, those reasons are more often economically motivated than they are biologically reasoned.
So, what does the science say?
Our brains are never finished developing: True or False?
True! If we define “finished developing” as “we cease doing neurogenesis and neuroplasticity is no longer in effect”.
Glossary:
- Neurogenesis: the process of creating new brain cells
- Neuroplasticity: the process of the brain adapting to changes by essentially rebuilding itself to suit our perceived current needs
We say “perceived” because sometimes neuroplasticity can do very unhelpful things to us (e.g: psychological trauma, or even just bad habits), but on a biological level, it is always doing its best to serve our overall success as an organism.
For a long time it was thought that we don’t do neurogenesis at all as adults, but this was found to be untrue:
How To Grow New Brain Cells (At Any Age)
Summary of conclusions of the above: we’re all growing new brain cells at every age, even if we be in our 80s and with Alzheimer’s disease, but there are things we can do to enhance our neurogenic potential along the way.
Neuroplasticity will always be somewhat enhanced by neurogenesis (after all, new neurons get given jobs to do), and we reviewed a great book about the marvels of neuroplasticity including in older age:
Our brains are still developing up to the age of 25: True or False?
True! And then it keeps on developing after that, too. This is abundantly obvious considering what we just talked about, but see what a difference the phrasing makes? It makes it sound like development stops at 25—which this statement doesn’t claim at all; it only speaks for the time up to that age.
A lot of the popular press about “the brain isn’t fully mature until the age of 25” stems from a 2006 study that found:
❝For instance, frontal gray matter volume peaks at about age 11.0 years in girls and 12.1 years in boys, whereas temporal gray matter volume peaks at about age 16.7 years in girls and 16.2 years in boys. The dorsal lateral prefrontal cortex, important for controlling impulses, is among the latest brain regions to mature without reaching adult dimensions until the early 20s.❞
Source: Structural Magnetic Resonance Imaging of the Adolescent Brain
There are several things to note here:
- The above statement is talking about the physical size of the brain growing
- Nowhere does it say “and stops developing at 25”
However… The study only looked at brains up to the age of 25. After that, they stopped looking, because the study was about “the adolescent brain” so there has to be a cut-off somewhere, and that was the cut-off they chose.
This is the equivalent of saying “it didn’t stop raining until four o’clock” when the reality is that four o’clock is simply when you gave up on checking.
The study didn’t misrepresent this, by the way, but the popular press did!
Another 2012 study looked at various metrics of brain development, and found:
- Synapse overproduction into the teens
- Cortex pruning into the late 20s
- Prefrontal pruning into middle age at least (they stopped looking)
- Myelination beyond middle age (they stopped looking)
Source: Experience and the developing prefrontal cortex ← check out figure 1, and make sure you’re looking at the human data not the rat data
So how’s the most recent research looking?
Here’s a 2022 study that looked at 123,984 brain scans spanning the age range from mid-gestation to 100 postnatal years, and as you can see from its own figure 1, most (if not all) brain-things keep growing for life; most slow down at some point, but they don’t stop:
Brain charts for the human lifespan ← check out figure 1; don’t get too excited about the ventricular volume column as that is basically “brain that isn’t being a brain”. Do get excited about the rest, though!
Want to know how not to get caught out by science being misrepresented by the popular press? Check out:
How Science News Outlets Can Lie To You (Yes, Even If They Cite Studies!)
Take care!
Codependency Isn’t What Most People Think It Is
In popular parlance, people are often described as “codependent” when they rely on each other to function normally. That’s interdependent mutualism—and while it too can become a problem (if a person deprived of their “other half” has no idea how to do laundry, or forgets to take their meds), it’s not codependency.
Codependency finds its origins in the treatment and management of alcoholism, and has been expanded to encompass other forms of relationships with dependence on substances and/or self-destructive behaviors—which can be many things, including the non-physical, for example a pattern of irresponsible impulse-spending, or sabotaging one’s own relationship(s).
We’ll use the simplest example, though:
- Person A is (for example) an alcoholic. They have a dependency.
- Person B, married to A, is not an alcoholic. However, their spouse’s dependency affects them greatly, and they do what they can to manage it, experiencing tension between wanting to “save” their spouse and wanting their spouse to be ok—which, superficially, often means letting them have their alcohol.
Person B is thus said to be “codependent”.
The problem with codependency
The problems of codependency are mainly twofold:
- The dependent partner’s dependency is enabled and thus perpetuated by the codependent partner—without their partner keeping them from too great a harm (be it financial, social, psychological, medical, whatever), they might actually have to address their dependency
- The codependent partner is not having a good time of it either. They have the stress of two lives with the resources (e.g. time) of one. They are stressing about something they cannot control, understandably worrying about their loved one, and, worse: every action they might take to “save” their loved one by reducing the substance use, is an action that makes their partner unhappy, and causes conflict too.
Note: codependency is often a thing in romantic relationships, but it can appear in other relationships too, e.g. parent-child, or even between friends.
See also: Development and validation of a revised measure of codependency
How to deal with this
If you find yourself in a codependent position, or are advising someone who is, there are some key things that can help:
- Be a nurturer, not a rescuer. It is natural to want to “rescue” someone we care about, but there are some things we cannot do for them. Instead, we must look for ways to build their strength so that they can take the steps that only they can take to fix the problem.
- Establish boundaries. Practise saying “no”, and also be clear over what things you can and cannot control—and let go of the latter. Communicate this, though. An “I’m not the boss of you” angle can prompt a lot of people to take more personal responsibility.
- Schedule time for yourself. You might take some ideas from our previous tangentially-related article:
How To Avoid Carer Burnout (Without Dropping Care)
Want to read more?
That’s all we have space for today, but here’s a very useful page with a lot of great resources (including questionnaires and checklists and things, in case you’re thinking “is it, or…?”)
Your Simplest Life – by Lisa Turner
We probably know how to declutter, and perhaps even do an “unnecessary financial expenditures” audit. So, what does this offer beyond that?
A large portion of this book focuses on keeping our general life in a state of “flow”, and strategies include:
- How to make sure you’re doing the right part of the 80:20 split on a daily basis
- Knowing when to switch tasks, and when not to
- Knowing how to plan time for tasks
- No more reckless optimism, but also without falling foul of Parkinson’s Law (i.e. work expands to fill the time allotted to it)
- Decluttering your head, too!
When it comes to managing life responsibilities in general, Turner is very attuned to generational differences, including the different challenges faced by each generation, what’s more often expected of us, what we’re used to, and how we probably initially learned to do it (or not).
To this end, a lot of strategies are tailored with variations for each age group. Not often does an author take the time to address each part of their readership like that, and it’s really helpful that she does!
All in all, a great book for simplifying your daily life.
Kale vs Watercress – Which is Healthier?
Our Verdict
When comparing kale to watercress, we picked the kale.
Why?
It was very close! If ever we’ve been tempted to call something a tie, this has been the closest so far.
Their macros are close; watercress has a tiny amount more protein and slightly lower carbs, but these numbers are tiny, so it’s not really a factor. Nevertheless, on macros alone we’d call this a slight nominal win for watercress.
In terms of vitamins, they’re even. Watercress has higher vitamin E and choline (sometimes considered a vitamin), as well as being higher in some B vitamins. Kale has higher vitamins A and K, as well as being higher in some other B vitamins.
In the category of minerals, watercress has higher calcium, magnesium, phosphorus, and potassium, while kale has higher copper, iron, manganese, and zinc. The margins are slightly wider for kale’s more plentiful minerals though, so we’ll call this section a marginal win for kale.
When it comes to polyphenols, kale takes and maintains the lead here, with around 2x the quercetin and 27x the kaempferol. Watercress does have some lignans that kale doesn’t, but ultimately, kale’s strong flavonoid content keeps it in the lead.
So of course: enjoy both if both are available! But if we must pick one, it’s kale.
Want to learn more?
You might like to read:
- Fight Inflammation & Protect Your Brain, With Quercetin
- Spinach vs Kale – Which is Healthier?
- Thai-Style Kale Chips (recipe)
Take care!