Taking A Trip Through The Evidence On Psychedelics

10almonds is reader-supported. We may, at no cost to you, receive a portion of sales if you purchase a product through a link in this article.

In Tuesday’s newsletter, we asked you for your opinions on the medicinal use of psychedelics, and got the above-depicted, below-described, set of responses:

  • 32% said “This is a good, evidence-based way to treat many brain disorders”
  • 32% said “There are some benefits, but they don’t outweigh the risks”
  • 20% said “This can help a select few people only; useless for the majority”
  • 16% said “This is hippie hogwash and hearsay; wishful thinking at best”

Quite a spread of answers, so what does the science say?

This is hippie hogwash and hearsay; wishful thinking at best! True or False?

False! We’re tackling this one first, because it’s easiest to answer:

There are some moderately-well established [usually moderate] clinical benefits from some psychedelics for some people.

If that sounds like a very guarded statement, it is. Part of this is because “psychedelics” is an umbrella term; perhaps we should have conducted separate polls for psilocybin, MDMA, ayahuasca, LSD, ibogaine, etc, etc.

In fact: maybe we will do separate main features for some of these, as there is a lot to say about each of them separately.

Nevertheless, looking at the spread of research as it stands for psychedelics as a category, the answers are often similar across the board, even when the benefits/risks may differ from drug to drug.

To speak in broad terms, if we were to make a research summary for each drug it would look approximately like this in each case:

  • there has been research into this, but not nearly enough; “the war on drugs” may well have been manifestly lost (the winner of the war being: drugs, still around and more plentiful than ever), but it really did cramp science for a few decades.
  • the studies are often small, heterogeneous (often using moderately wealthy, white, student-age population samples), and with a low standard of evidence (i.e. the methodology often has some holes that leave room for reasonable doubt).
  • the benefits recorded are often small and transient.
  • in their favor, though, the risks are also generally recorded as being quite low, assuming proper safe administration*.

*Illustrative example:

Person A takes MDMA in a club, dances their cares away, has had only alcohol to drink, sweats buckets but they don’t care because they love everyone and they see how we’re all one really and it all makes sense to them and then they pass out from heat exhaustion and dehydration and suffer kidney damage (not to mention a head injury when falling) and are hospitalized and could die;

Person B takes MDMA in a lab, is overwhelmed with a sense of joy and the clarity of how their participation in the study is helping humanity; they want to hug the researcher and express their gratitude; the researcher reminds them to drink some water.

Which is not to say that a lab is the only safe manner of administration; there are many possible setups for supervised usage sites. But it does mean that the risks are often as much environmental as they are risks inherent to the drug itself.

Others are more inherent to the drug itself, such as adverse cardiac events for some drugs (ibogaine is one that definitely needs medical supervision, for example).

For those who’d like to see numbers and clinical examples of the bullet points we gave above, here you go; this is a great (and very readable) overview:

NIH | Evidence Brief: Psychedelic Medications for Mental Health and Substance Use Disorders

Notwithstanding the word “brief” (intended in the sense of: briefing), this is not especially brief and is rather an entire book (available for free, right there!), but we do recommend reading it if you have time.

This can help a select few people only; useless for the majority: True or False?

True, technically, insofar as the evidence points to these drugs being useful for such things as depression, anxiety, PTSD, addiction, etc, and estimates of how many people struggle with mental health issues in general are often cited as 1 in 4, or 1 in 5. Of course, many people may just have moderate anxiety, or a transient period of depression, etc; many, meanwhile, have it worse.

In short: a very large minority of people suffer from mental health issues, and for each such issue, there may be one or more psychedelics that could help.

This is a good, evidence-based way to treat many brain disorders: True or False?

True if and only if we’re willing to accept the so far weak evidence that we discussed above. False otherwise, while the jury remains out.

One thing in its favor though is that while the evidence is weak, it’s not contradictory, insofar as the large preponderance of evidence says such therapies probably do work (there aren’t many studies that returned negative results); the evidence is just weak.

When a thousand scientists say “we’re not completely sure, but this looks like it helps; we need to do more research”, then it’s good to believe them on all counts—the positivity and the uncertainty.

This is a very different picture than we saw when looking at, say, ear candling or homeopathy (things that the evidence says simply do not work).

We haven’t been linking individual studies so far, because that book we linked above has many, and the number of studies we’d have to list would be:

n = (number of kinds of psychedelic drugs) × (number of conditions to be treated)

e.g. how does psilocybin fare for depression, eating disorders, anxiety, addiction, PTSD, this, that, the other; now how does ayahuasca fare for each of those, and so on for each drug and condition; at least 25 or 30 as a baseline number, and we don’t have that room.

But here are a few samples to finish up:

In closing…

The general scientific consensus is presently “many of those drugs may ameliorate many of those conditions, but we need a lot more research before we can say for sure”.

On a practical level, an important take-away from this is twofold:

  • drugs, even those popularly considered recreational, aren’t ontologically evil, generally do have putative merits, and have been subject to a lot of dramatization/sensationalization, especially by the US government in its famous war on drugs.
  • drugs, even those popularly considered beneficial and potentially life-changingly good, are still capable of doing great harm if mismanaged; so if putting aside “don’t do drugs” as propaganda of the past, then please do still hold onto “don’t do drugs alone”; trained professional supervision is a must for safety.

Take care!

Don’t Forget…

Did you arrive here from our newsletter? Don’t forget to return to the email to continue learning!

Recommended

  • Antidepressants: Personalization Is Key!
  • The Vagus Nerve (And How You Can Make Use Of It)
    The Vagus Nerve: The Brain-Gut Connection. Learn how to stimulate the vagus nerve for better health through massage, electrostimulation, and diaphragmatic breathing. Watch the video for a demonstration.

Learn to Age Gracefully

Join the 98k+ American women taking control of their health & aging with our 100% free (and fun!) daily emails:

  • Apples vs Carrots – Which is Healthier?


    Our Verdict

    When comparing apples to carrots, we picked the carrots.

    Why?

    Both are sweet crunchy snacks, both rightly considered very healthy options, but one comes out clearly on top…

    Both contain lots of antioxidants, albeit mostly different ones; both are good on this front.

    Looking at their macros, however, apples have more carbs while carrots have more fiber. The carb:fiber ratio in apples is already sufficient to make them very healthy, but carrots do win.

    In the category of vitamins, carrots have many times more of vitamins A, B1, B2, B3, B5, B6, B9, C, E, K, and choline. Apples are not higher in any vitamins.

    In terms of minerals, carrots have a lot more calcium, copper, iron, magnesium, manganese, phosphorus, potassium, selenium, and zinc. Apples are not higher in any minerals.

    If “an apple a day keeps the doctor away”, what might a carrot a day do?

    Want to learn more?

    You might like to read:

    Sugar: From Apples to Bees, and High-Fructose C’s

    Take care!


  • Which Sugars Are Healthier, And Which Are Just The Same?


    From Apples to Bees, and high-fructose C’s

    We asked you for your (health-related) policy on sugar. The trends were as follows:

    • About half of all respondents voted for “I try to limit sugar intake, but struggle because it’s in everything”
    • About a quarter of all respondents voted for “Refined sugar is terrible; natural sugars (e.g. honey, agave) are fine”
    • About a quarter of all respondents voted for “Sugar is sugar and sugar is bad; I avoid it entirely”
    • One (1) respondent voted for “Sugar is an important source of energy, so I consume plenty”

    Writer’s note: I always forget to vote in these, but I’d have voted for “I try to limit sugar intake, but struggle because it’s in everything”.

    Sometimes I would like to make my own [whatever] to not have the sugar, but it takes so much more time, and often money too.

    So while I make most things from scratch (and typically spend about an hour cooking each day), sometimes store-bought is the regretfully practical timesaver/moneysaver (especially when it comes to condiments).

    So, where does the science stand?

    There has, of course, been a lot of research into the health impact of sugar.

    Unfortunately, a lot of it has been funded by sugar companies, which has not helped. Conversely, there are also studies funded by other institutions with other agendas to push, and some of them will seek to make sugar out to be worse than it is.

    So for today’s mythbusting overview, we’ve done our best to quality-control studies for not having financial conflicts of interest. And of course, we applied the usual considerations of favoring high-quality studies where possible: large sample sizes, good methodology, human subjects, that sort of thing.

    Sugar is sugar and sugar is bad: True or False?

    False and True, respectively.

    • Sucrose is sucrose, and is generally bad.
    • Fructose is fructose, and is worse.

    Both ultimately get converted into glycogen (if not used immediately for energy), but for fructose, this happens mostly* in the liver, which (a) taxes the liver, and (b) goes largely unregulated by the pancreas, allowing potentially dangerous blood sugar spikes.

    This has several interesting effects:

    • Because fructose doesn’t directly affect insulin levels, it doesn’t cause insulin insensitivity (yay)
    • Because fructose doesn’t directly affect insulin levels, this leaves hyperglycemia untreated (oh dear)
    • Because fructose is metabolized by the liver and converted to glycogen which is stored there, it’s one of the main contributors to non-alcoholic fatty liver disease (at this point, we’re retracting our “yay”)

    Read more: Fructose and sugar: a major mediator of non-alcoholic fatty liver disease

    *”Mostly” meaning about 80% in the liver. The remaining ~20% is processed by the kidneys, where it contributes to kidney stones instead. So, still not fabulous.

    Fructose is very bad, so we shouldn’t eat too much fruit: True or False?

    False! Fruit is really not the bad guy here. Fruit is good for you!

    Fruit does contain fructose yes, but not actually that much in the grand scheme of things, and moreover, fruit contains (unless you have done something unnatural to it) plenty of fiber, which mitigates the impact of the fructose.

    • A medium-sized apple (one of the most sugary fruits there is) might contain around 11g of fructose
    • A tablespoon of high-fructose corn syrup can have about 27g of fructose (plus about 3g glucose)

    Read more about it: Effects of high-fructose (90%) corn syrup on plasma glucose, insulin, and C-peptide in non-insulin-dependent diabetes mellitus and normal subjects

    However! The fiber content (in fruit) mitigates the impact of the fructose almost entirely anyway.

    And if you take fruits that are high in sugar and/but high in polyphenols, like berries, they now have a considerable net positive impact on glycemic health:

    You may be wondering: what was that about “unless you have done something unnatural to it”?

    That’s mostly about juicing. Juicing removes much (or all) of the fiber, and if you do that, you’re basically back to shooting fructose into your veins:

    Natural sugars like honey, agave, and maple syrup, are healthier than refined sugars: True or False?

    True, sometimes, and sometimes only marginally.

    This is partly because of the glycemic index and glycemic load. The glycemic index scores tail off thus:

    • table sugar = 65
    • maple syrup = 54
    • honey = 46
    • agave syrup = 15

    So, that’s a big difference there between agave syrup and maple syrup, for example… But it might not matter if you’re using a very small amount, which means it may have a high glycemic index but a low glycemic load.
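
To make the GI-vs-glycemic-load point concrete, here’s a minimal sketch. The formula (glycemic load = GI × grams of carbohydrate in the serving ÷ 100) is the standard one; the GI values are those quoted above, while the ~4 g of carbohydrate per teaspoon is an illustrative assumption:

```python
# Glycemic load (GL) = glycemic index (GI) × grams of carbohydrate per serving / 100.
# A GL under 10 is conventionally considered "low".

def glycemic_load(gi: float, carbs_g: float) -> float:
    """Glycemic load of a serving containing carbs_g grams of carbohydrate."""
    return gi * carbs_g / 100

# GI values as quoted above; ~4 g carbs per teaspoon is an illustrative assumption
sweeteners = {"table sugar": 65, "maple syrup": 54, "honey": 46, "agave syrup": 15}

for name, gi in sweeteners.items():
    print(f"{name}: GI {gi}, GL per teaspoon = {glycemic_load(gi, 4):.1f}")
```

Even table sugar, at GI 65, comes out at a GL of only ~2.6 per teaspoon; the differences between sweeteners matter far more at larger serving sizes.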

    Note, incidentally, that table sugar, sucrose, is a disaccharide, and is 50% glucose and 50% fructose.

    The other more marginal health benefits come from the fact that natural sugars are usually found in foods high in other nutrients. Maple syrup is very high in manganese, for example, and also a fair source of other minerals.

    But… Because of its GI, you really don’t want to be relying on it for your nutrients.

    Wait, why is sugar bad again?

    We’ve been covering mostly the more “mythbusting” aspects of different forms of sugar, rather than the less controversial harms it does, but let’s give at least a cursory nod to the health risks of sugar overall:

    That last one, by the way, was a huge systematic review of 37 large longitudinal cohort studies. Results varied depending on what, specifically, was being examined (e.g. total sugar, fructose content, sugary beverages, etc), and gave up to 200% increased cancer risk in some studies on sugary beverages, but 95% increased risk is a respectable example figure to cite here, pertaining to added sugars in foods.

    And finally…

    The 56 Most Common Names for Sugar (Some Are Tricky)

    How many did you know?


  • Early Bird Or Night Owl? Genes vs Environment


    A Sliding Slope?

    In Tuesday’s newsletter, we asked you how much control you believe we have over our sleep schedule, and got the above-depicted, below-described, set of responses:

    • 45% said “most people can control it; some people with sleep disorders cannot”
    • 35% said “our genes predispose us to early/late, but we can slide it a bit”
    • 15% said “going against our hardwired sleep schedules is a road to ruin”
    • 5% said “anyone can adjust their sleep schedule with enough willpower”

    You may be wondering: what’s with those single-digit numbers in the graph there? And the answer is: Tuesday’s email didn’t go out at the usual time due to a scheduling mistake (sorry!), which is probably what affected the number of responses (poll response levels vary, but are usually a lot higher than this).

    Note: yes, this does mean most people who read our newsletter don’t vote. So, not to sound like a politician on the campaign trail, but… Your vote counts! We always love reading your comments when you add those, too—often they provide context that allows us to tailor what we focus on in our articles.

    However, those are the responses we got, so here we are!

    What does the science say?

    Anyone can adjust their sleep with enough willpower: True or False?

    False, simply. It’s difficult for most people, but for many people with sleep disorders, it is outright impossible.

    In a battle of narcolepsy vs willpower, for example, no amount of willpower will stop the brain from switching to sleep mode when it thinks it’s time to sleep:

    ❝Narcolepsy is the most common neurological cause of chronic sleepiness. The discovery about 20 years ago that narcolepsy is caused by selective loss of the neurons producing orexins sparked great advances in the field

    [There is also] developing evidence that narcolepsy is an autoimmune disorder that may be caused by a T cell-mediated attack on the orexin neurons and explain how these new perspectives can inform better therapeutic approaches.❞

    ~ Dr. Carrie Mahoney et al. (lightly edited for brevity)

    Source: The neurobiological basis of narcolepsy

    For further reading, especially if this applies to you or a loved one:

    Living with Narcolepsy: Current Management Strategies, Future Prospects, and Overlooked Real-Life Concerns

    Our genes predispose us to early/late, but we can slide it a bit: True or False?

    True! First, about our genes predisposing us:

    Genome-wide association analysis of 89,283 individuals identifies genetic variants associated with self-reporting of being a morning person

    …and also:

    Gene distinguishes early birds from night owls and helps predict time of death

    Now, as for the “can slide it a bit”, this is really just a function of the general categories of “early bird” and “night owl” spanning periods of time that allow for a few hours’ wiggle-room at either side.

    However, it is recommended to make any actual changes more gradually, with the Sleep Foundation going so far as to recommend 30 minutes, or even just 15 minutes, of change per day:

    Sleep Foundation | How to Fix Your Sleep Schedule

    Going against our hardwired sleep schedule is a road to ruin: True or False?

    False, contextually. By this we mean: our “hardwired” sleep schedule is (for most of us) genetically predisposed, but not predetermined.

    Also, genetic predispositions are not necessarily always good for us; one would not argue, for example, for avoiding going against a genetic predisposition to addiction.

    Some genetic predispositions are just plain bad for us, and genes can be a bit of a lottery.

    That said, we do recommend getting some insider knowledge (literally), by getting personal genomics tests done, if that’s a viable option for you, so you know what’s really a genetic trait (and what to do with that information) and what’s probably caused by something else (and what to do with that information):

    Genetic Testing: Health Benefits & Methods

    Take care!


Related Posts

  • Antidepressants: Personalization Is Key!
  • The “Yes I Can” Salad


    Sometimes, we are given to ask ourselves: “Can I produce a healthy and tasty salad out of what I have in?” and today we show how, with a well-stocked pantry, the answer is “yes I can”, regardless of what is (or isn’t) in the fridge.

    You will need

    • 1 can cannellini beans, drained
    • 1 can sardines (if vegetarian/vegan, substitute ½ can chickpeas, drained)
    • 1 can mandarin segments
    • 1 handful pitted black olives, from a jar (or from a can, if you want to keep the “yes I can” theme going)
    • ½ red onion, thinly sliced (this can be from frozen, defrosted—sliced/chopped onion is always a good thing to have in your freezer, by the way; your writer here always has 1–6 lbs of chopped onions in hers, divided into 1lb bags)
    • 1 oz lemon juice
    • 1 tbsp chopped parsley (this can be freeze-dried, but fresh is good if you have it)
    • 1 tbsp extra virgin olive oil
    • 1 tbsp chia seeds
    • 1 tsp miso paste
    • 1 tsp honey (omit if you don’t care for sweetness; substitute with agave nectar if you do like sweetness but don’t want to use honey specifically)
    • 1 tsp red chili flakes

    Method

    (we suggest you read everything at least once before doing anything)

    1) Combine the onion and the lemon juice in a small bowl, massaging gently

    2) Mix (in another bowl) the miso paste with the chili flakes, chia seeds, honey, olive oil, and the spare juice from the can of mandarin segments, and whisk it to make a dressing.

    3) Add the cannellini beans, sardines (break them into bite-size chunks), mandarin segments, olives, and parsley, tossing them thoroughly (but gently) in the dressing.

    4) Top with the sliced onion, discarding the excess lemon juice, and serve:

    Enjoy!

    Want to learn more?

    For those interested in some of the science of what we have going on today:

    Take care!


  • I’m iron deficient. Which supplements will work best for me and how should I take them?


    Iron deficiency is common and can be debilitating. It mainly affects women. One in three premenopausal women are low in iron compared to just 5% of Australian men. Iron deficiency particularly affects teenage girls, women who do a lot of exercise and those who are pregnant.

    The body needs iron to make new red blood cells, and to support energy production, the immune system and cognitive function. If you’re low, you may experience a range of symptoms including fatigue, weakness, shortness of breath, headache, irregular heartbeat and reduced concentration.

    If a blood test shows you’re iron deficient, your doctor may recommend you start taking an oral iron supplement. But should you take a tablet or a liquid? With food or not? And when is the best time of day?

    Here are some tips to help you work out how, when and what iron supplement to take.


    How do I pick the right iron supplement?

    The iron in your body is called “elemental iron”. Choosing the right oral supplement and dose will depend on how much elemental iron it has – your doctor will advise exactly how much you need.

    The sweet spot is between 60 and 120 mg of elemental iron. Any less and the supplement won’t be effective in topping up your iron levels. Any higher and you risk gastrointestinal symptoms such as diarrhoea, cramping and stomach pain.

    Low iron can especially affect people during pregnancy and women who do a lot of sport.

    In Australia, iron salts are the most common oral supplements because they are cheap, effective and come in different delivery methods (tablets, capsules, liquid formulas). The iron salts you are most likely to find in your local chemist are ferrous sulfate (~20% elemental iron), ferrous gluconate (~12%) and ferrous fumarate (~33%).

    These formulations all work similarly, so your choice should come down to dose and cost.
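
The arithmetic for comparing products is simple: multiply the mass of the iron salt by its elemental-iron fraction. A minimal sketch, using the approximate fractions quoted above (the 325 mg tablet size is an illustrative assumption, not a recommendation):

```python
# Elemental iron delivered = mass of the iron salt × its elemental-iron fraction.
# Fractions are the approximate figures quoted above.

FRACTIONS = {
    "ferrous sulfate": 0.20,
    "ferrous gluconate": 0.12,
    "ferrous fumarate": 0.33,
}

def elemental_iron_mg(salt: str, salt_mg: float) -> float:
    """Approximate elemental iron (mg) in salt_mg of the given iron salt."""
    return salt_mg * FRACTIONS[salt]

# e.g. a hypothetical 325 mg ferrous sulfate tablet:
print(elemental_iron_mg("ferrous sulfate", 325))  # prints 65.0, i.e. within the 60-120 mg sweet spot
```

This is why the choice can come down to dose and cost: the same elemental dose can be reached with different salts, just at different tablet sizes.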

    Many multivitamins may look like an iron supplement, but it’s important to note they generally contain too little iron – often less than 20 mg – to correct an iron deficiency.

    Should I take tablets or liquid formulas?

    Iron contained within a tablet is just as well absorbed as iron found in a liquid supplement. Choosing the right one usually comes down to personal preference.

    The main difference is that liquid formulas tend to contain less iron than tablets. That means you might need to take more of the product to get the right dose, so using a liquid supplement could work out to be more expensive in the long term.

    What should I eat with my iron supplement?

    Research has shown you will absorb more of the iron in your supplement if you take it on an empty stomach. But this can cause more gastrointestinal issues, so might not be practical for everyone.

    If you do take your supplement with meals, it’s important to think about what types of food will boost – rather than limit – iron absorption. For example, taking the supplement alongside vitamin C improves your body’s ability to absorb it.

    Some supplements already contain vitamin C. Otherwise you could take the supplement along with a glass of orange juice, or other vitamin C-rich foods.

    Taking your supplement alongside foods rich in vitamin C, like orange juice or kiwifruit, can help your body absorb the iron.

    On the other hand, tea, coffee and calcium all decrease the body’s ability to absorb iron. So you should try to limit these close to the time you take your supplement.

    Should I take my supplement in the morning or evening?

    The best time of day to take your supplement is in the morning. The body can absorb significantly more iron earlier in the day, when concentrations of hepcidin (the main hormone that regulates iron) are at their lowest.

    Exercise also affects the hormone that regulates iron. That means taking your iron supplement after exercising can limit your ability to absorb it. Taking your supplement in the hours following exercise will mean significantly poorer absorption, especially if you take it between two and five hours after you stop.

    Our research has shown if you exercise every day, the best time to take your supplement is in the morning before training, or immediately after (within 30 minutes).

    My supplements are upsetting my stomach. What should I do?

    If you experience gastrointestinal side effects such as diarrhoea or cramps when you take iron supplements, you may want to consider taking your supplement every second day, rather than daily.

    Taking a supplement every day is still the fastest way to restore your iron levels. But a recent study has shown taking the same total dose can be just as effective when it’s taken on alternate days. For example, taking a supplement every day for three months works as well as every second day for six months. This results in fewer side effects.

    Oral iron supplements can be a cheap and easy way to correct an iron deficiency. But ensuring you are taking the right product, under the right conditions, is crucial for their success.

    It’s also important to check your iron levels prior to commencing iron supplementation and do so only under medical advice. In large amounts, iron can be toxic, so you don’t want to be consuming additional iron if your body doesn’t need it.

    If you think you may be low on iron, talk to your GP to find out your best options.

    Alannah McKay, Postdoctoral Research Fellow, Sports Nutrition, Australian Catholic University

    This article is republished from The Conversation under a Creative Commons license. Read the original article.


  • Wholewheat Bread vs Seeded White – Which is Healthier?


    Our Verdict

    When comparing wholewheat bread to seeded bread, we picked the wholewheat.

    Why?

    First, we will acknowledge that this is a false dichotomy; it is possible to have seeded wholewheat bread. However, it is very common to have wholewheat bread that isn’t seeded, and white bread that is seeded. So, it’s important to be able to decide which is the healthier option, since very often, this false dichotomy is what’s on offer.

    We will also advise checking labels (or the baker, if getting from a bakery) to ensure that visibly brown bread is actually wholewheat, and not just dyed brown with caramel coloring or such (yes, that is a thing that some companies do).

    Now, as for why we chose the wholewheat over the seeded white…

    In terms of macronutrients, wholewheat bread has (on average; individual breads may vary, of course) 2x the protein and a lot more fiber.

    Those seeds in seeded bread? They just aren’t enough to make a big impact on the overall nutritional value of the bread in those regards. Per slice, you are getting, what, 10 seeds maybe? This is not a meaningful dietary source of much.

    Seeded bread does have proportionally more healthy fats, but the doses are still so low as to make it not worth the while; it just looks like a lot when expressed as a percentage comparison, because the wholewheat bread has trace amounts and the seeded bread has several times those trace amounts; several times a trace amount is still a tiny amount. So, we’d recommend looking to other sources for those healthy fats.

    Maybe dip your bread, of whatever kind, into extra virgin olive oil, for example.

    Wholewheat bread of course also has a lower glycemic index. Those seeds in seeded white bread don’t really slow it down at all, because they’re not digested until later.

    Want to learn more?

    You might like to read:

    Enjoy!
