Healthy Tiramisu
10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.
Tiramisu (literally “pick-me-up”, “tira-mi-su”) is a delightful dish that, in its traditional form, is also a trainwreck for your health, being loaded with inflammatory cream and sugar, not to mention the cholesterol content. Here we recreate the dish in healthy fashion, loading it instead with protein, fiber, and healthy fats, not to mention that the optional sweetener is an amino acid (glycine). The coffee and cocoa, of course, are full of antioxidants too. All in all, what’s not to celebrate?
You will need
- 2 cups silken tofu (no need to press it) (do not substitute with any other kind of tofu or it will not work)
- 1 cup oat cream (you can buy this ready-made, or make it yourself by blending oats in water until you get the desired consistency) (you can also just use dairy cream, but that will be less healthy)
- 1 cup almond flour (also simply called “ground almonds”)
- 1 cup espresso ristretto, or otherwise the strongest black coffee you are able to make
- ¼ cup unsweetened cocoa powder, plus more for dusting
- 1 pack savoiardi biscuits, also called “ladyfinger” biscuits (this was the only part we couldn’t make healthy—if you figure out a way to make it healthy, let us know!) (if vegan, obviously use a vegan substitute biscuit; this writer uses Lotus/Biscoff biscuits, which work well)
- 1 tsp vanilla essence
- ½ tsp almond essence
- Optional: glycine, to taste
- Garnish: roasted coffee beans
Method
(we suggest you read everything at least once before doing anything)
1) Add glycine to the coffee first if you want the overall dish to be sweeter. Glycine has approximately the same sweetness as sugar, and can be used as a 1:1 substitution. Use that information as you see fit.
2) Blend the tofu and the cream together in a high-speed blender until smooth. It should have a consistency like cake batter; if it is too liquidy, add small amounts of almond flour until it is thicker. If it’s too thick, add oat cream until it isn’t. If you want it to be sweeter than it is, add glycine to taste. When happy with its taste and consistency, divide it evenly into two bowls.
3) Add the vanilla essence and almond essence to one bowl, and the cocoa powder to the other, mixing well (in a food processor, or just by using a whisk).
4) Coat the base of a glass dish (such as a Pyrex oven dish, but any dish is fine, and any glass dish will allow for viewing the pretty layers we’ll be making) with a very thin layer of almond flour (if you want sweetness there, you can mix some glycine in with the almond flour first).
5) One by one, soak the biscuits briefly in the coffee, and use them to line the base of the dish.
6) Add a thin layer of chocolate cream, ensuring the surface is as flat as possible. Dust it with cocoa powder, to increase the surface tension.
7) Add a thin layer of vanilla-and-almond cream, ensuring the surface is as flat as possible. Dust it with cocoa powder, to increase the surface tension.
8) Stop and assess: do you have enough ingredients left to repeat these layers? It will depend on the size and shape of the dish you used. If you do, repeat them, finishing with a vanilla-and-almond cream layer.
9) Dust the final layer with cocoa powder if you haven’t already, and add the coffee bean garnish, if using.
10) Refrigerate for at least 8 hours; if you have time to prepare it the day before you will eat it, that is best of all.
Enjoy!
Want to learn more?
For those interested in some of the science of what we have going on today:
- Easily Digestible Vegetarian Protein Sources
- Why You Should Diversify Your Nuts!
- The Bitter Truth About Coffee (or is it?)
- The Sweet Truth About Glycine
- Tiramisu Crunch Bites ← craving tiramisu but not keen on all that effort? Enjoy these!
Take care!
Apples vs Oranges – Which is Healthier?
Our Verdict
When comparing apples to oranges, we picked the oranges.
Why?
In terms of macros, the two fruits are approximately equal (and indeed, on average, precisely equal in the most important metric, which is fiber). So, a tie here.
In the category of vitamins, apples are higher in vitamin K, while oranges are higher in vitamins A, B1, B2, B3, B5, B6, B7, B9, C, and choline. An easy win for oranges this time.
When it comes to minerals, apples have more iron and manganese, while oranges have more calcium, copper, magnesium, phosphorus, potassium, selenium, and zinc. Another easy win for oranges.
So, adding up the sections, a clear win for oranges. But, by all means, enjoy either or both! Diversity is good.
Want to learn more?
You might like to read:
From Apples to Bees, and High-Fructose Cs: Which Sugars Are Healthier, And Which Are Just The Same?
Take care!
Mental illness, psychiatric disorder or psychological problem. What should we call mental distress?
We talk about mental health more than ever, but the language we should use remains a vexed issue.
Should we call people who seek help patients, clients or consumers? Should we use “person-first” expressions such as person with autism or “identity-first” expressions like autistic person? Should we apply or avoid diagnostic labels?
These questions often stir up strong feelings. Some people feel that patient implies being passive and subordinate. Others think consumer is too transactional, as if seeking help is like buying a new refrigerator.
Advocates of person-first language argue people shouldn’t be defined by their conditions. Proponents of identity-first language counter that these conditions can be sources of meaning and belonging.
Avid users of diagnostic terms see them as useful descriptors. Critics worry that diagnostic labels can box people in and misrepresent their problems as pathologies.
Underlying many of these disagreements are concerns about stigma and the medicalisation of suffering. Ideally the language we use should not cast people who experience distress as defective or shameful, or frame everyday problems of living in psychiatric terms.
Our new research, published in the journal PLOS Mental Health, examines how the language of distress has evolved over nearly 80 years. Here’s what we found.
Generic terms for the class of conditions
Generic terms – such as mental illness, psychiatric disorder or psychological problem – have largely escaped attention in debates about the language of mental ill health. These terms refer to mental health conditions as a class.
Many terms are currently in circulation, each an adjective followed by a noun. Popular adjectives include mental, mental health, psychiatric and psychological, and common nouns include condition, disease, disorder, disturbance, illness, and problem. Readers can encounter every combination.
These terms and their components differ in their connotations. Disease and illness sound the most medical, whereas condition, disturbance and problem need not relate to health. Mental implies a direct contrast with physical, whereas psychiatric implicates a medical specialty.
Mental health problem, a recently emerging term, is arguably the least pathologising. It implies that something is to be solved rather than treated, makes no direct reference to medicine, and carries the positive connotations of health rather than the negative connotation of illness or disease.
Arguably, this development points to what cognitive scientist Steven Pinker calls the “euphemism treadmill”, the tendency for language to evolve new terms to escape (at least temporarily) the offensive connotations of those they replace.
English linguist Hazel Price argues that mental health has increasingly come to replace mental illness to avoid the stigma associated with that term.
How has usage changed over time?
In the PLOS Mental Health paper, we examine historical changes in the popularity of 24 generic terms: every combination of the nouns and adjectives listed above.
We explore the frequency with which each term appears from 1940 to 2019 in two massive text data sets representing books in English and diverse American English sources, respectively. The findings are very similar in both data sets.
The figure presents the relative popularity of the top ten terms in the larger data set (Google Books). The 14 least popular terms are combined into the remainder.
Several trends appear. Mental has consistently been the most popular adjective component of the generic terms. Mental health has become more popular in recent years but is still rarely used.
Among nouns, disease has become less widely used while illness has become dominant. Although disorder is the official term in psychiatric classifications, it has not been broadly adopted in public discourse.
Since 1940, mental illness has clearly become the preferred generic term. Although an assortment of alternatives has emerged, it has steadily risen in popularity.
Does it matter?
Our study documents striking shifts in the popularity of generic terms, but do these changes matter? The answer may be: not much.
One study found people think mental disorder, mental illness and mental health problem refer to essentially identical phenomena.
Other studies indicate that labelling a person as having a mental disease, mental disorder, mental health problem, mental illness or psychological disorder makes no difference to people’s attitudes toward them.
We don’t yet know if there are other implications of using different generic terms, but the evidence to date suggests they are minimal.
Is ‘distress’ any better?
Recently, some writers have promoted distress as an alternative to traditional generic terms. It lacks medical connotations and emphasises the person’s subjective experience rather than whether they fit an official diagnosis.
Distress appears 65 times in the 2022 Victorian Mental Health and Wellbeing Act, usually in the expression “mental illness or psychological distress”. By implication, distress is a broad concept akin to but not synonymous with mental ill health.
But is distress destigmatising, as it was intended to be? Apparently not. According to one study, it was more stigmatising than its alternatives. The term may turn us away from other people’s suffering by amplifying it.
So what should we call it?
Mental illness is easily the most popular generic term and its popularity has been rising. Research indicates different terms have little or no effect on stigma and some terms intended to destigmatise may backfire.
We suggest that mental illness should be embraced and the proliferation of alternative terms such as mental health problem, which breed confusion, should end.
Critics might argue mental illness imposes a medical frame. Philosopher Zsuzsanna Chappell disagrees. Illness, she argues, refers to subjective first-person experience, not to an objective, third-person pathology, like disease.
Properly understood, the concept of illness centres the individual and their connections. “When I identify my suffering as illness-like,” Chappell writes, “I wish to lay claim to a caring interpersonal relationship.”
As generic terms go, mental illness is a healthy option.
Nick Haslam, Professor of Psychology, The University of Melbourne and Naomi Baes, Researcher – Social Psychology/ Natural Language Processing, The University of Melbourne
This article is republished from The Conversation under a Creative Commons license. Read the original article.
What’s the difference between ADD and ADHD?
Around one in 20 people has attention-deficit hyperactivity disorder (ADHD). It’s one of the most common neurodevelopmental disorders in childhood and often continues into adulthood.
ADHD is diagnosed when people experience problems with inattention and/or hyperactivity and impulsivity that negatively impact them at school or work, in social settings and at home.
Some people call the condition attention-deficit disorder, or ADD. So what’s the difference?
In short, what was previously called ADD is now known as ADHD. So how did we get here?
Let’s start with some history
The first clinical description of children with inattention, hyperactivity and impulsivity was in 1902. British paediatrician Professor George Still presented a series of lectures about his observations of 43 children who were defiant, aggressive, undisciplined and extremely emotional or passionate.
Since then, our understanding of the condition evolved and made its way into the Diagnostic and Statistical Manual of Mental Disorders, known as the DSM. Clinicians use the DSM to diagnose mental health and neurodevelopmental conditions.
The first DSM, published in 1952, did not include a specific related child or adolescent category. But the second edition, published in 1968, included a section on behaviour disorders in young people. It referred to ADHD-type characteristics as “hyperkinetic reaction of childhood or adolescence”. This described the excessive, involuntary movement of children with the disorder.
In the early 1980s, the third edition of the DSM added a condition it called “attention deficit disorder”, listing two sub-types: attention deficit disorder with hyperactivity (ADDH), and attention deficit disorder without hyperactivity.
However, seven years later, a revised DSM (DSM-III-R) replaced ADD (and its two sub-types) with ADHD and the three sub-types we have today:
- predominantly inattentive
- predominantly hyperactive-impulsive
- combined.
Why change ADD to ADHD?
ADHD replaced ADD in the DSM-III-R in 1987 for a number of reasons.
First was the controversy and debate over the presence or absence of hyperactivity: the “H” in ADHD. When ADD was initially named, little research had been done to determine the similarities and differences between the two sub-types.
The next issue was around the term “attention-deficit” and whether these deficits were similar or different across both sub-types. Questions also arose about the extent of these differences: if these sub-types were so different, were they actually different conditions?
Meanwhile, a new focus on inattention (an “attention deficit”) recognised that children with inattentive behaviours may not necessarily be disruptive and challenging but are more likely to be forgetful and daydreamers.
Why do some people use the term ADD?
There was a surge of diagnoses in the 1980s. So it’s understandable that some people still hold onto the term ADD.
Some may identify as having ADD out of habit, because this is what they were originally diagnosed with, or because they don’t have hyperactivity/impulsivity traits.
Others who don’t have ADHD may use the term they came across in the 80s or 90s, not knowing the terminology has changed.
How is ADHD currently diagnosed?
The three sub-types of ADHD, outlined in the DSM-5, are:
- predominantly inattentive. People with the inattentive sub-type have difficulty sustaining concentration, are easily distracted and forgetful, lose things frequently, and are unable to follow detailed instructions
- predominantly hyperactive-impulsive. Those with this sub-type find it hard to be still, need to move constantly in structured situations, frequently interrupt others, talk non-stop and struggle with self-control
- combined. Those with the combined sub-type experience the characteristics of those who are inattentive and hyperactive-impulsive.
ADHD diagnoses continue to rise among children and adults. And while ADHD was commonly diagnosed in boys, more recently we have seen growing numbers of girls and women seeking diagnoses.
However, some international experts contest the expanded definition of ADHD, driven by clinical practice in the United States. They argue the challenges of unwanted behaviours and educational outcomes for young people with the condition are uniquely shaped by each country’s cultural, political and local factors.
Regardless of the name changes made to reflect what we know about the condition, ADHD continues to impact the educational, social and life situations of many children, adolescents and adults.
Kathy Gibbs, Program Director for the Bachelor of Education, Griffith University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Our family is always glued to separate devices. How can we connect again?
It’s Saturday afternoon and the kids are all connected to separate devices. So are the parents. Sound familiar?
Many families want to set ground rules to help them reduce their screen time – and have time to connect with each other, without devices.
But it can be difficult to know where to start and how to make a plan that suits your family.
First, look at your own screen time
Before telling children to “hop off the tech”, it’s important parents understand how much they are using screens themselves.
Globally, the average person spends six hours and 58 minutes on screens each day. This has increased by 13%, or 49 minutes, since 2013.
Parents who report high screen time use tend to see this filtering down to the children in their family too. Two-thirds of primary school-aged children in Australia have their own mobile screen-based device.
Australia’s screen time guidelines recommend children aged five to 17 years have no more than two hours of sedentary screen time (excluding homework) each day. For those aged two to five years, it’s no more than one hour a day. And the guidelines recommend no screen time at all for children under two.
Yet the majority of children, across age groups, exceed these maximums. A new Australian study released this week found the average three-year-old is exposed to two hours and 52 minutes of screen time a day.
Some screen time is OK, too much increases risks
Technology has profoundly impacted children’s lives, offering both opportunities and challenges.
On one hand, it provides access to educational resources, can develop creativity, facilitates communication with peers and family members, and allows students to seek out new information.
On the other hand, excessive screen use can result in too much time being sedentary, delays in developmental milestones, disrupted sleep and daytime drowsiness.
Too much screen time can affect social skills, as it replaces time spent in face-to-face social interactions. This is where children learn verbal and non-verbal communication, develop empathy, learn patience and how to take turns.
Many families also worry about how to maintain a positive relationship with their children when so much of their time is spent glued to screens.
What about when we’re all on devices?
When families are all using devices simultaneously, there are fewer face-to-face interactions, reducing communication and shifting family dynamics.
The increased use of wireless technology enables families to easily tune out from each other by putting in earphones, reducing the opportunity for conversation. Family members wearing earphones during shared activities or meals creates a physical barrier and encourages people to retreat into their own digital worlds.
Wearing earphones for long periods may also reduce connection to, and closeness with, family members. Research on video gaming, for instance, found excessive gaming increases feelings of isolation and loneliness, displaces real-world social interactions, and weakens relationships with peers and family members.
How can I set screen time limits?
Start by sitting down as a family and discussing what limits you all feel would be appropriate when using TVs, phones and gaming – and when is an appropriate time to use them.
Have set rules around family time – for example, no devices at the dinner table – so you can connect through face-to-face interactions.
Consider locking your phone or devices away at certain periods throughout the week, such as after 9pm (or within an hour of bedtime for younger children), and seek out opportunities to balance your days with physical activities, such as kicking a footy at the park or going on a family bush walk.
Parents can model healthy behaviour by regulating and setting limits on their own screen time. This might mean limiting your social media scrolling to 15 or 30 minutes a day and keeping your phone in the next room when you’re not using it.
When establishing appropriate boundaries and ensuring children’s safety, it is crucial for parents and guardians to engage in open communication about technology use. This includes teaching critical thinking skills to navigate online content safely and employing parental control tools and privacy settings.
Parents can foster a supportive and trusting relationship with children from an early age so children feel comfortable discussing their online experiences and sharing their fears or concerns.
For resources to help you develop your own family’s screen time plan, visit the Raising Children Network.
Elise Waghorn, Lecturer, School of Education, RMIT University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Coca-Cola vs Diet Coke – Which is Healthier?
Our Verdict
When comparing Coca-Cola to Diet Coke, we picked the Diet Coke.
Why?
While the Diet Coke is bad, the Coca-Cola has mostly the same problems plus sugar.
The sugar in a can of Coca-Cola is 39g of high-fructose corn syrup (the worst kind of sugar yet known to humanity), and of course it’s being delivered in liquid form (the most bioavailable way to get it, which in this case is bad).
To put those 39g into perspective, the daily recommended amount of sugar is 36g for men or 25g for women, according to the AHA.
The sweetener in Diet Coke is aspartame, which has had many health-risk accusations made against it, most of which have not stood up to scrutiny. The main risk it does have is that it “mimics sugar too well”: it can increase cravings for sweetness, and therefore drive higher consumption of sugars in other products. For this reason, the World Health Organization has recommended simply reducing sugar intake, rather than looking to artificial sweeteners for help.
Nevertheless, aspartame has been found safe (in moderate doses; the upper tolerance level would equate to more than 20 cans of Diet Coke per day) by food safety agencies ranging from the FDA to the EFSA, based on a large body of science.
Other problems that Diet Coke has are present in Coca-Cola too, such as its acidic nature (bad for tooth enamel) and gassy nature (messes with leptin/ghrelin balance).
Summary: the Diet Coke is relatively less unhealthy, but is still bad in numerous ways, and remains best avoided.
Sauerkraut vs Pickled Cucumber – Which is Healthier?
Our Verdict
When comparing sauerkraut to pickled cucumber, we picked the sauerkraut.
Why?
Both of these fermented foods can give a gut-healthy microbiome boost, but how do they stack up otherwise?
In terms of macros, sauerkraut has more protein, carbs, and fiber. They are both low glycemic index foods, so we’ll go with the one that has more fiber out of the two, and that’s the ‘kraut.
In the category of vitamins, sauerkraut has more of vitamins B1, B2, B3, B5, B6, B7, B9, C, E, and choline, while pickled cucumbers have more of vitamins A and K. An easy win for sauerkraut.
When it comes to minerals, sauerkraut has more calcium, copper, iron, magnesium, manganese, phosphorus, potassium, selenium, and zinc, while pickled cucumbers are not higher in any mineral, except sodium (on average, pickled cucumbers have about 2x the sodium of sauerkraut). Another clear win for sauerkraut.
In short, enjoy either or both in moderation, but it’s clear which boasts more nutritional benefits, and that’s the sauerkraut!
Want to learn more?
You might like to read:
Make Friends With Your Gut (You Can Thank Us Later)
Take care!