Moderator: DiTM
Staff member
Aug 16, 2019
your dad's house
Good luck on your finals.

Don't get psyched out by the math in papers you read. A lot of the time it's there to support verbal conclusions (showing the work). Glean what you can and move on.
Thanks man

I guess that would be better phrased as: I would like to learn the math behind it. I’ve always had a bit of a bent for theory.

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Psychedelics help patients restore cognitive abilities

by Tim Hinchliffe | The Sociable | 19 May 2020

With over 25 years in the “brain preservation business,” Dr. Roger McIntyre is one of the world’s most recognized experts in mood disorders.

He is also the CEO of Champignon Brands — the only clinic in Canada to perform psilocybin dosing, with Health Canada approval.

Dr. McIntyre tells The Sociable that mood disorders are the great enemy of the state because they debase human capital by reducing cognitive function.

“The reduction in human GDP is coming from cognition, and that’s what took me towards very novel treatments like ketamine and psychedelics because we really need to find treatments that can try to improve people’s cognitive abilities and reduce their cognitive disabilities,” said Dr. McIntyre.

When we think of lowered cognitive capabilities, we may assume that there’s a lowering of IQ. But Dr. McIntyre says that IQ does not go down with mood disorders like depression; rather, circuit connectivity in the brain is disrupted, and psychedelics can provide the proverbial Ctrl-Alt-Delete.

“We now think depression is a disconnection syndrome no different than your PC and your motherboard. There’s something disconnected,” he says.

But it’s not just depression that Dr. McIntyre describes as a “disconnection syndrome.” He says that most mood disorders stem from the same source problem — a disconnect in the brain circuitry — something wrong with the motherboard, what he calls “CNN, or Circuits, Nodes, and Networks.”

“We now think that if you have autism, schizophrenia, anorexia nervosa, depression, bipolar, dementia, Alzheimer’s — the convergent view now is that although those are not the same disease states, they share something in common — there’s something wrong with the motherboard,” he said.

“Just imagine your motherboard on your PC. You go to turn it on one morning and it’s not turning on, or it’s really slow. That, metaphorically, is what depression does to your brain. There’s something wrong with the circuit connectivity.”

This is where metaphor and reality blend. Thanks to technology in the lab, specialists like Dr. McIntyre can actually see which parts of the brain are lighting up and which ones are disconnected in real-time, just like peeking inside a computer.

“And it’s not just a metaphor,” he says. “When we do experiments in the lab using MRI and looking at the brain, we are able to tease out the networks in the brain. It’s really incredible, you can actually see the networks live in a real living, breathing brain.”

If the brain is like a slow or poorly-operating computer, then psychedelics work like fast-acting defrag tools and antivirus software that help reset and rewire the motherboard brain, so it can think clearly again.

And thinking clearly can have a profoundly positive impact on mood.

In successful treatment cases, the differences can be like night and day.

As one of Dr. McIntyre’s patients described it, “My brain was like a Commodore 64 while I was depressed; now it’s like a Pentium, zapped-up, turbo-charged brain again.”

When someone has a mood disorder, it doesn’t mean that they are any less smart, it just means that their circuit connectivity is off and needs to be reset.

“The cool thing about ketamine and psychedelics,” according to Dr. McIntyre, “is the brain resets very quickly, like after one or two doses.”

And it’s not just a reset of a certain area of the brain; it’s a reset of the entire brain — the full effects of which have yet to be entirely understood.

A person can take an antidepressant, which can reset their mood, but it won’t reset their cognitive functionality.

“While antidepressants of the conventional variety take longer to work, they are more localized. They don’t help certain circuits like cognition, for example,” said Dr. McIntyre.

Psychedelic treatments, on the other hand, can reset the mood, the cognitive functionality, and a whole lot more.

If mood disorders can be classified as disconnection syndromes, then psychedelics look to be the software that can reconnect the circuitry.


mr peabody


So what do new neurons in the brains of adults actually do?

by Ashley Yeager | The Scientist | 1 May 2020

Adult neurogenesis, already appreciated for its role in learning and memory, also participates in mental health and possibly even attention, new research suggests.

In the spring of 2019, neuroscientist Heather Cameron set up a simple experiment. She and her colleagues put an adult rat in the middle of a plastic box with a water bottle at one end. They waited until the rat started drinking and then made a startling noise to see how the animal would respond. The team did this repeatedly with regular rats and with animals that were genetically altered so that they couldn’t make new neurons in their hippocampuses, a brain region involved in learning and memory. When the animals heard the noise, those that could make new hippocampal neurons immediately stopped slurping water and looked around, but the animals lacking hippocampal neurogenesis kept drinking. When the team ran the experiment without the water bottle, both sets of rats looked around right away to figure out where the sound was coming from. Rats that couldn’t make new neurons seemed to have trouble shifting their attention from one task to another, the researchers concluded.

“It’s a very surprising result,” says Cameron, who works at the National Institute of Mental Health (NIMH) in Bethesda, Maryland. Researchers studying neurogenesis in the adult hippocampus typically conduct experiments in which animals have had extensive training in a task, such as in a water maze, or have experienced repetitive foot shocks, she explains. In her experiments, the rats were just drinking water. “It seemed like there would be no reason that the hippocampus should have any role,” she says. “Yet in animals engineered to lack hippocampal neurogenesis, the effects are pretty big.”

The study joins a growing body of work that challenges the decades-old notion that the primary role of new neurons within the adult hippocampus is in learning and memory. More recently, experiments have tied neurogenesis to forgetting, one possible way to ensure the brain doesn’t become overloaded with information it doesn’t need, and to anxiety, depression, stress, and, as Cameron’s work suggests, attention. Now, neuroscientists are rethinking the role that new neurons, and the hippocampus as a whole, play in the brain.
Related health conditions

Most of the research into neurogenesis involves boosting or inhibiting animals’ generation of new neurons, then training the animals on a complex memory task such as finding a treat in a maze, and later retesting them. Decreasing neurogenesis tends to hamper the animals’ ability to remember. (Related conditions: Alzheimer’s disease, Parkinson’s disease)

Training mice or rats on a memory task before manipulating neurogenesis has also been found to affect the strength of the trained memory. Boosting neurogenesis reduced the memory’s strength, perhaps an extreme form of forgetting that at normal levels avoids the remembering of unnecessary details. (Related conditions: Alzheimer’s disease and other forms of dementia)

Research has linked decreased neurogenesis with more anxious and depressive behaviors in mice. Stress can reduce neurogenesis, ultimately leading mice to be more anxious in future stressful situations. (Related conditions: PTSD, anxiety, depression)

Research has linked decreased neurogenesis with trouble switching focus.

The memory link

The first hint that adult animal brains may make new neurons appeared in the early 1960s, when MIT neurobiologist Joseph Altman used radioactive labeling to track the proliferation of nerve cells in adult rats’ brains. Other data published in the 1970s and 1980s supported the conclusion, and in the 1990s, Fred “Rusty” Gage and his colleagues at the Salk Institute in La Jolla, California, used an artificial nucleotide called bromodeoxyuridine (BrdU) to tag new neurons born in the brains of adult rats and humans. Around the same time, Elizabeth Gould of Princeton University and her collaborators showed that adult marmoset monkeys made new neurons in their hippocampuses, specifically in an area called the dentate gyrus. While some researchers questioned the strength of the evidence supporting the existence of adult neurogenesis, most of the field began to shift from studying whether adult animal brains make new neurons to what role those cells might play.

In 2011, René Hen at Columbia University and colleagues created a line of transgenic mice in which neurons generated by neurogenesis survived longer than in wildtype mice. This boosted the overall numbers of new neurons in the animals’ brains. The team then tested the modified mice’s cognitive abilities. Boosting numbers of newly born neurons didn’t improve the mice’s performances in water mazes or avoidance tasks compared with control mice. But it did seem to help them distinguish between two events that were extremely similar. Mice with more new neurons didn’t freeze as long as normal mice when put into a box that was similar to but not exactly the same as one in which they’d experienced a foot shock in earlier training runs.

These results dovetailed with others coming out at the time, particularly those showing that aging humans, in whom neurogenesis is thought to decline, often have trouble remembering details that distinguish similar experiences, what researchers call pattern separation. “The line of thinking is that the memories that are most likely to be impacted by neurogenesis are memories that are really similar to each other,” says Sarah Parylak, a staff scientist in Gage’s lab at the Salk Institute.

As insights into pattern separation emerged, scientists were beginning to track the integration of new rodent neurons into existing neural networks. This research showed that new neurons born in the dentate gyrus had to compete with mature neurons for connections to neurons in the entorhinal cortex (EC), a region of the brain with widespread neural networks that play roles in memory, navigation, and the perception of time. Based on detailed anatomical images, new dentate gyrus neurons in rodents appeared to tap into preexisting synapses between dentate gyrus neurons and EC neurons before creating their own links to EC neurons.

To continue exploring the relationship between old and new neurons, a group led by the Harvard Stem Cell Institute’s Amar Sahay, who had worked with Hen on the team’s 2011 study, wiped out synapses in the dentate gyruses of mice. The researchers overexpressed the cell death–inducing protein Krüppel-like factor 9 in young adult, middle-aged, and old mice to destroy neuronal dendritic spines, tiny protrusions that link up to protrusions of other neurons, in the brain region. Those lost connections led to increased integration of newly made neurons, especially in the two older groups, which outperformed age-matched, untreated mice in pattern-separation tasks. Adult-born dentate gyrus neurons decrease the likelihood of reactivation of those old neurons, Sahay and colleagues concluded, preventing the memories from being confused.

Parylak compares this situation to going to the same restaurant after it has changed ownership. In her neighborhood in San Diego, there’s one location where she’s dined a few times when the restaurant was serving different cuisine. “It’s the same location, and the building retains many of the same features, so the experiences would be easy to mix up,” she says, but she can tell them apart, possibly because of neurogenesis’s role in pattern separation. This might even hold true for going to the same restaurant on different occasions, even if it served the same food.

That’s still speculative at this point. Researchers haven’t been able to watch neurogenesis in action in a living human brain, and it’s not at all clear if the same thing is going on there as in the mouse brains they have observed. While many scientists now agree that neurogenesis does occur in adult human brains, there is little consensus about what it actually does. In addition to the work supporting a role for new neurons in pattern separation, researchers have accumulated evidence that it may be more important for forgetting than it is for remembering.

How adult-born neurons integrate into the brain

In recent years, images and videos taken with state-of-the-art microscopy techniques have shown that new neurons in the dentate gyrus of the hippocampus go through a series of changes as they link up to existing networks in the brain.

A neural stem cell divides to generate a new neuron (green).

As the new neuron grows, it rotates from a horizontal to a vertical position and connects to an interneuron (yellow) in a space called the hilus that sits within the curve of the dentate gyrus. The young neuron also starts making connections with well-established dentate gyrus neurons (blue) as well as neurons in the hippocampus (red).

Once connections are formed, mature neurons send signals into the new neuron, and the cell starts firing off more of its own signals. At around four weeks of age, the adult-born neuron gets hyperexcited, sending electrical signals much more often than its well-established neuronal neighbors do.

As the new neuron connects with still more neurons, interneurons in the hilus start to send it signals to tamp down its activity.

The importance of forgetting

It seems counterintuitive for neurogenesis to play a role in both remembering and forgetting, but work by Paul Frankland of the Hospital for Sick Children Research Institute in Toronto suggests it is possible. In 2014, his team showed that when mice made more new neurons than normal, they were more forgetful. He and his colleagues had mice run on wheels to boost levels of neurogenesis, then trained the animals on a learning task. As expected, they did better than control mice who hadn’t exercised. In other animals, the researchers boosted neurogenesis after the mice learned information thought to be stored, at least in the short term, in the hippocampus. “When we did that, what we found was quite surprising,” Frankland says. “We found a big reduction in memory strength.”

His team was puzzled by the result. Adding to the confusion, the researchers had observed a larger effect in memory impairment with mice that learned, then exercised, than they had seen in memory improvement when the mice ran first and then learned. As he dug into the literature, Frankland realized the effect was what other neuroscientists had called forgetting. He found many theoretical papers based on computational modeling that argued that as new neurons integrate into a circuit, the patterns of connections in the circuit change, and if information is stored in those patterns of connections, that information may be lost.

The notion surprised other neuroscientists, mainly because up to that point they’d had two assumptions related to neurogenesis and forgetting. “The first was that generating new neurons in a normal animal should be good for memory. The second was that forgetting was bad. The first assumption is still true,” Frankland says, but the second is not. “Many people think of forgetting as some sort of failure in our memory systems,” he explains. Yet in healthy brains there’s tons of forgetting happening all of the time. “And, in fact, it’s important for memory function,” Frankland says. “It would actually be disadvantageous to remember everything we do.”

Parylak says this idea of forgetting “certainly has provoked a lot of discussion.” It’s unclear, for example, whether the mice in Frankland’s experiments are forgetting, or if they are identifying a repeat event as something novel. “This is the point,” she explains, “where doing neurogenesis research in humans would be beneficial. You could ask a person if they’d actually forgotten or if they are making some kind of extreme discrimination.”

Despite the questions regarding the results, Frankland and his colleagues continued their work, testing mice’s forgetfulness with all types of memories, and more recently they asked whether the forgetting effect jeopardized old and new memories alike. In experiments, his team gave mice a foot shock, then boosted hippocampal neurogenesis (with exercise or a genetic tweak to neural progenitor cells), and put the mice in the same container they’d been shocked in. With another group of mice, the researchers waited nearly a month after the foot shock before boosting neurogenesis and putting the mice back in the container. Boosting the number of new neurons, the team found, only weakened the newly made memory, but not one that had been around for a while. “This makes a lot of sense,” Frankland says. “As our memories of everyday events gradually get consolidated, they become less and less dependent on the hippocampus,” and more dependent on another brain region: the cortex. This suggests that remote memories are less sensitive to changes in hippocampal neurogenesis levels.

"The hippocampus tracks what’s happened to you," Frankland says. “Much of that’s forgotten because much of it is inconsequential. But every now and then something interesting seems to happen,” and it’s these eventful memories that seem to get “backed up” in other areas of the brain.

How adult-born neurons function in a circuit

Researchers think neurogenesis helps the brain distinguish between two very similar objects or events, a phenomenon called pattern separation. According to one hypothesis, new neurons’ excitability in response to novel objects diminishes the response of established neurons in the dentate gyrus to incoming stimuli, helping to create a separate circuit for the new, but similar, memory.

Beyond memory

At NIMH, one of Cameron’s first studies looking at the effects of neurogenesis tested the relationship between new neuronal growth and stress. She uncovered the connection studying mice that couldn’t make new neurons and recording how they behaved in an open environment with food at the center. Just like mice that could still make new neurons, the neurogenesis-deficient mice were hesitant to go get the food in the open space, but eventually they did. However, when the animals that couldn’t make new neurons were stressed before being put into the open space, they were extremely cautious and anxious, whereas normal mice didn’t behave any differently when stressed.

Cameron realized that the generation of new neurons also plays a role in the brain separate from the learning and memory functions for which there was growing evidence. In her experiments, “we were looking for memory effects and looked for quite a while without finding anything and then stumbled onto this stress effect,” she says.

The cells in the hippocampus are densely packed with receptors for stress hormones. One class of hormones in particular, glucocorticoids, is thought to inhibit neurogenesis, and decreased neurogenesis has been associated with depression and anxiety behaviors in rodents. But there wasn’t a direct link between the experience of stress and the development of these behaviors. So Cameron and her colleagues set up an experiment to test the connection.

When the team blocked neurogenesis in adult mice and then restrained the animals to moderately stress them, their elevated glucocorticoid levels were slow to recover compared with mice that had normal neurogenesis. The stressed mice that could not generate new neurons also acted oddly in behavioral tests: they avoided food when put in a new environment, became immobile and increasingly distressed when forced to swim, and drank less sugary water than normal mice when it was offered to them, suggesting they don’t work as hard as normal mice to experience pleasure. Impaired adult neurogenesis, the experiments showed, played a direct role in developing symptoms of depression, Cameron says.

The notion that neurogenesis and stress might be tied directly to our mental states led Cameron to look back into the literature, where she found many suggestions that the hippocampus plays a role in emotion, in addition to learning and memory. Even Altman, who unexpectedly identified neurogenesis in adult rodents in the 1960s, and colleagues suggested as much in the 1970s. Yet the argument has only appeared sporadically in the literature since then. “Stress is complicated,” Cameron says. “It’s hard to know exactly how stressful experiences affect neurogenesis or how the generation of new neurons will influence an animal’s response to stress. Some types of stress can decrease neurogenesis while others, such as certain forms of intermittent stress, can increase new neuronal growth.” Last year, Cameron and colleagues found that generating new neurons helps rats used to model post-traumatic stress disorder recover from acute and prolonged periods of stress.

Her work has also linked neurogenesis to other characteristics of rodent behavior, including attention and sociability. In 2016, with Gould at Princeton and a few other collaborators, she published work suggesting that new neurons are indeed tied to social behavior. The team created a hierarchy among rats, and then deconstructed those social ranks by removing the dominant male. When the researchers sacrificed the animals and counted new neurons in their brains, the rats from deconstructed hierarchies had fewer new neurons than those from control cages with stable ranks. Rats with uncertain hierarchies and fewer new neurons didn’t show any signs of anxiety or reduced cognition, but they weren’t as inclined as control animals to spend time with new rats put into their quarters, preferring to stick with the animals they knew. When given a drug—oxytocin—to boost neurogenesis, they once again began exploring and spending time with new rats that entered their cages.

The study from Cameron’s lab on rats’ ability to shift their attention grew out of the researchers’ work on stress, in which they observed that rodents sometimes couldn’t switch from one task to the next. Turning again to the literature, Cameron found a study from 1969 that seemed to suggest that neurogenesis might affect this task-switching behavior. Her team set up the water bottle experiments to see how well rats shifted attention. Inhibiting neurogenesis in the adult rats led to a 50 percent decrease in their ability to switch their focus from drinking to searching for the source of the sound.

“This paper is very interesting,” says J. Tiago Gonçalves, a neuroscientist at Albert Einstein College of Medicine in New York who studies neurogenesis but was not involved in the study. “It could explain the findings seen in some behavioral tasks and the incongruences between findings from different behavioral tasks,” he writes in an email to The Scientist. “Of course, follow-up work is needed,” he adds.

Cameron argues that shifting attention may be yet another behavior in which the hippocampus plays an essential role but that researchers have been overlooking. And there may be an unexplored link between making new neurons and autism or other attention disorders, she says. Children with autism often have trouble shifting their attention from one image to the next in behavioral tests unless the original image is removed.

It’s becoming clear, Cameron continues, that neurogenesis has many functions in the adult brain, some that are very distinct from learning and memory. In tasks requiring attention, though, there is a tie to memory, she notes. “If you’re not paying attention to things, you will not remember them.”

Do new neurons appear anywhere else in the brain?

Many, though not all, neuroscientists agree that there’s ongoing neurogenesis in the hippocampus of most mammals, including humans. In rodents and many other animals, neurogenesis has also been observed in the olfactory bulbs. Whether newly generated neurons show up anywhere else in the brain is more controversial.

There had been hints of new neurons showing up in the striatum of primates in the early 2000s. In 2005, Heather Cameron of the National Institute of Mental Health and colleagues corroborated those findings, showing evidence of newly made neurons in the rat neocortex, a region of the brain involved in spatial reasoning, language, movement, and cognition, and in the striatum, a region of the brain involved in planning movements and reacting to rewards, as well as self-control and flexible thinking. Nearly a decade later, using nuclear-bomb-test-derived carbon-14 isotopes to identify when nerve cells were born, Jonas Frisén of the Karolinska Institute in Stockholm and colleagues examined the brains of postmortem adult humans and confirmed that new neurons existed in the striatum.

“Those results are great,” Cameron says. They support her idea that there are different types of neurons being born in the brain throughout life. “The problem is they’re very small cells, they’re very scattered, and there’re very few of them. So they’re very tough to see and very tough to study.”


mr peabody


Psychedelics Promote Structural and Functional Neural Plasticity

Calvin Ly, Alexandra C. Greb, Lindsay P. Cameron, Jonathan M. Wong, Eden V. Barragan, Paige C. Wilson, Kyle F. Burbach, Sina Soltanzadeh Zarandi, Alexander Sood, Michael R. Paddy, Whitney C. Duim, Megan Y. Dennis, A. Kimberley McAllister, Kassandra M. Ori-McKenney, John A. Gray, David E. Olson

Atrophy of neurons in the prefrontal cortex (PFC) plays a key role in the pathophysiology of depression and related disorders. The ability to promote both structural and functional plasticity in the PFC has been hypothesized to underlie the fast-acting antidepressant properties of the dissociative anesthetic ketamine. Here, we report that, like ketamine, serotonergic psychedelics are capable of robustly increasing neuritogenesis and/or spinogenesis both in vitro and in vivo. These changes in neuronal structure are accompanied by increased synapse number and function, as measured by fluorescence microscopy and electrophysiology. The structural changes induced by psychedelics appear to result from stimulation of the TrkB, mTOR, and 5-HT2A signaling pathways and could possibly explain the clinical effectiveness of these compounds. Our results underscore the therapeutic potential of psychedelics and, importantly, identify several lead scaffolds for medicinal chemistry efforts focused on developing plasticity-promoting compounds as safe, effective, and fast-acting treatments for depression and related disorders.


Classical serotonergic psychedelics are known to cause changes in mood and brain function that persist long after the acute effects of the drugs have subsided. Moreover, several psychedelics elevate glutamate levels in the cortex and increase gene expression in vivo of the neurotrophin BDNF as well as immediate-early genes associated with plasticity. This indirect evidence has led to the reasonable hypothesis that psychedelics promote structural and functional neural plasticity, although this assumption had never been rigorously tested. The data presented here provide direct evidence for this hypothesis, demonstrating that psychedelics cause both structural and functional changes in cortical neurons.

Prior to this study, two reports suggested that psychedelics might be able to produce changes in neuronal structure. Jones et al. (2009) demonstrated that DOI was capable of transiently increasing the size of dendritic spines on cortical neurons, but no change in spine density was observed. The second study showed that DOI promoted neurite extension in a cell line of neuronal lineage. Both of these reports utilized DOI, a psychedelic of the amphetamine class. Here we demonstrate that the ability to change neuronal structure is not a unique property of amphetamines like DOI because psychedelics from the ergoline, tryptamine, and iboga classes of compounds also promote structural plasticity. Additionally, D-amphetamine does not increase the complexity of cortical dendritic arbors in culture, and therefore, these morphological changes cannot be simply attributed to an increase in monoamine neurotransmission.

The identification of psychoplastogens belonging to distinct chemical families is an important aspect of this work because it suggests that ketamine is not unique in its ability to promote structural and functional plasticity. In addition to ketamine, the prototypical psychoplastogen, only a relatively small number of plasticity-promoting small molecules have been identified previously. We observe that hallucinogens from four distinct structural classes (i.e., tryptamine, amphetamine, ergoline, and iboga) are also potent psychoplastogens, providing additional lead scaffolds for medicinal chemistry efforts aimed at identifying neurotherapeutics. Furthermore, our cellular assays revealed that several of these compounds were more efficacious (e.g., MDMA) or more potent (e.g., LSD) than ketamine. In fact, the plasticity-promoting properties of psychedelics and entactogens rivaled that of BDNF. The extreme potency of LSD in particular might be due to slow off kinetics, as recently proposed following the disclosure of the LSD-bound 5-HT2B crystal structure.

Importantly, the psychoplastogenic effects of psychedelics in cortical cultures were also observed in vivo using both vertebrate and invertebrate models, demonstrating that they act through an evolutionarily conserved mechanism. Furthermore, the concentrations of psychedelics utilized in our in vitro cell culture assays were consistent with those reached in the brain following systemic administration of therapeutic doses in rodents. This suggests that neuritogenesis, spinogenesis, and/or synaptogenesis assays performed using cortical cultures might have value for identifying psychoplastogens and fast-acting antidepressants. It should be noted that our structural plasticity studies performed in vitro utilized neurons exposed to psychedelics for extended periods of time. Because brain exposure to these compounds is often of short duration due to rapid metabolism, it will be interesting to assess the kinetics of psychedelic-induced plasticity.

A key question in the field of psychedelic medicine has been whether or not psychedelics promote changes in the density of dendritic spines. Using super-resolution SIM, we clearly demonstrate that psychedelics do, in fact, increase the density of dendritic spines on cortical neurons, an effect that is not restricted to a particular structural class of compounds. Using DMT, we verified that cortical neuron spine density increases in vivo and that these changes in structural plasticity are accompanied by functional effects such as increased amplitude and frequency of spontaneous EPSCs. We specifically designed these experiments to mimic previous studies of ketamine so that we might directly compare these two compounds, and, to a first approximation, they appear to be remarkably similar. Not only do they both increase spine density and neuronal excitability in the cortex, they seem to have similar behavioral effects. We have shown previously that, like ketamine, DMT promotes fear extinction learning and has antidepressant effects in the forced swim test. These results, coupled with the fact that ayahuasca, a DMT-containing concoction, has potent antidepressant effects in humans, suggest that classical psychedelics and ketamine might share a related therapeutic mechanism.

Although the molecular targets of ketamine and psychedelics are different (NMDA and 5-HT2A receptors, respectively), they appear to cause similar downstream effects on structural plasticity by activating mTOR. This finding is significant because ketamine is known to be addictive whereas many classical psychedelics are not. The exact mechanisms by which these compounds stimulate mTOR are still not entirely understood, but our data suggest that, at least for classical psychedelics, TrkB and 5-HT2A receptors are involved. Although most classical psychedelics are not considered to be addictive, there are still significant safety concerns with their use in medicine because they cause profound perceptual disturbances and still have the potential to be abused. Therefore, the identification of non-psychedelic analogs capable of promoting plasticity in the PFC could facilitate a paradigm shift in our approach to treating neuropsychiatric diseases. Moreover, such compounds could be critical to resolving the long-standing debate in the field concerning whether the subjective effects of psychedelics are necessary for their therapeutic effects. Although our group is actively investigating the psychoplastogenic properties of non-psychedelic analogs, others have reported the therapeutic potential of safer structural and functional analogs of ketamine.

Our data demonstrate that classical psychedelics from several distinct chemical classes are capable of robustly promoting the growth of both neurites and dendritic spines in vitro, in vivo, and across species. Importantly, our studies highlight the similarities between the effects of ketamine and those of classical serotonergic psychedelics, supporting the hypothesis that the clinical antidepressant and anxiolytic effects of these molecules might result from their ability to promote structural and functional plasticity in prefrontal cortical neurons. We have demonstrated that the plasticity-promoting properties of psychedelics require TrkB, mTOR, and 5-HT2A signaling, suggesting that these key signaling hubs may serve as potential targets for the development of psychoplastogens, fast-acting antidepressants, and anxiolytics. Taken together, our results suggest that psychedelics may be used as lead structures to identify next-generation neurotherapeutics with improved efficacy and safety profiles.

*See the entire study here:

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

High doses of ketamine found to temporarily 'switch off' the brain

University of Cambridge | 11 Jun 2020

Researchers have identified two brain phenomena that may explain some of the side-effects of ketamine. Their measurements of the brain waves of sheep sedated by the drug may explain the out-of-body experience and state of complete oblivion it can cause.

In a study aimed at understanding the effect of therapeutic drugs on the brains of people living with Huntington's disease, researchers used electroencephalography (EEG) to measure immediate changes in the animals' brain waves once ketamine -- an anaesthetic and pain relief drug -- was administered. Low frequency activity dominated while the sheep were asleep. When the drug wore off and the sheep regained consciousness, the researchers were surprised to see the brain activity start switching between high and low frequency oscillations. The bursts of different frequency were irregular at first, but became regular within a few minutes.

"As the sheep came around from the ketamine, their brain activity was really unusual," said Professor Jenny Morton at the University of Cambridge's Department of Physiology, Development and Neuroscience, who led the research. "The timing of the unusual patterns of sheep brain activity corresponded to the time when human users report feeling their brain has disconnected from their body."

She added: "It's likely that the brain oscillations caused by the drug may prevent information from the outside world being processed normally."

The findings arose as part of a larger research project into Huntington's disease, a condition that stops the brain working properly. The team want to understand why human patients respond differently to various drugs if they carry the gene for this disease. Sheep were used because they are recognised as a suitable pre-clinical model of disorders of the human nervous system, including Huntington's disease.

Six of the sheep were given a single higher dose of ketamine, 24mg/kg. This is at the high end of the anaesthetic range. Initially, the same response was seen as with a lower dose. But within two minutes of administering the drug, the brain activity of five of these six sheep stopped completely, one of them for several minutes -- a phenomenon that has never been seen before.

"This wasn't just reduced brain activity. After the high dose of ketamine the brains of these sheep completely stopped. We've never seen that before," said Morton. Although the anaesthetised sheep looked as though they were asleep, their brains had switched off. "A few minutes later their brains were functioning normally again -- it was as though they had just been switched off and on."

The researchers think that this pause in brain activity may correspond to what ketamine abusers describe as the 'K-hole' -- a state of oblivion likened to a near-death experience, which is followed by a feeling of great serenity. The study was published in the journal Scientific Reports.

Ketamine abusers are known to take doses many times higher than those given to the sheep in this research. It is also likely that progressively higher doses have to be taken to get the same effect. The researchers say that such high doses can cause liver damage, stop the heart, and be fatal.

To conduct the experiment sheep were put into veterinary slings, which are commonly used to keep animals safe during veterinary procedures. Different doses of ketamine were given to 12 sheep and their brain activity recorded with EEG.

Ketamine was chosen for the study because it is widely used as a safe anaesthetic and pain-relief drug for treating large animals including dogs, horses and sheep. It is also used medically, and is known as a 'dissociative anaesthetic' because patients can appear awake and move around, but they don't feel pain or process information normally -- many report feeling as though their mind has separated from their body.

At lower doses ketamine has a pain-relieving effect, and its use in adult humans is mainly restricted to field situations such as frontline pain-relief for injured soldiers or victims of road traffic accidents.

"Our purpose wasn't really to look at the effects of ketamine, but to use it as a tool to probe the brain activity in sheep with and without the Huntington's disease gene," said Morton. "But our surprising findings could help explain how ketamine works. If it disrupts the networks between different regions of the brain, this could make it a useful tool to study how brain networks function -- both in the healthy brain and in neurological diseases like Huntington's disease and schizophrenia."

Ketamine has recently been proposed as a new treatment for depression and post-traumatic stress disorder. Beyond its anaesthetic actions, however, very little is known about its effects on brain function.

"We think of anaesthetic drugs as just slowing everything down. That's what it looks like from the outside: the animals basically go to sleep and are unresponsive, and then they wake up very quickly. But when we looked at the brain activity, it seems to be a much more dynamic process," said Morton.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

MDMA neurotoxicity/brain damage

Is MDMA neurotoxic?

by Aimee Sarmiento, PharmD 2021 and Benjamin Malcolm, PharmD, MPH, BCPP | 10 June 2020

Neurotoxicity resulting in low mood, anxiety, insomnia and problems with cognition has been reported and well documented in literature relating to use of ‘Ecstasy’, but what about pure MDMA?

MDMA-induced neurotoxicity has been the subject of intense controversy, and even scandal, within the medical community over decades. As MDMA-assisted psychotherapy progresses toward approval as a legal therapeutic entity, questions regarding the neurotoxicity of MDMA will be clinically relevant as patients and providers weigh the risks and benefits of use. Moreover, clinical trials are shedding light on thresholds for neurotoxic effects and providing high-quality data to help guide safe use.

In this post we’ll give some background information on what neurotoxicity is, how MDMA may act as a neurotoxin, summarize research findings on MDMA-induced neurotoxicity, and explain use parameters that are likely to prevent development of significant neurotoxic effects.

“Recommendations” for harm reduction in persons who choose to use MDMA or ‘Ecstasy’ are not intended to condone the use of illicit substances or to recommend their use. This article is for information, education, and harm-reduction purposes. The authors recommend you do not break the law.

First of all, what IS neurotoxicity?

Neurotoxicity involves damage to neurons of the central or peripheral nervous system. It can be caused by several things, including drugs. One common misconception is that neurotoxicity is an all-or-nothing event, with neurons either healthy or destroyed; it is better understood as a spectrum of symptoms that is reversible in some cases and irreversible in others. It can begin with small functional deficits, which gradually progress to larger-scale functional decline or symptoms.

In the case of MDMA, this progression may present insidiously, whether as gradual memory loss, behavioral and mood issues, or decreased cognitive function over repeated exposures. Of course, large overdoses can produce severe neurotoxicity rapidly. Drugs can produce neurotoxicity through more mechanisms than can be recounted here, but it is worth noting that a drug may have a dosing window in which no neurotoxicity is observed, because the dose is modest enough not to push a biological system to the extremes at which toxic responses occur. Even substances we consider completely benign and healthful, such as water, can become neurotoxic under particular circumstances (e.g. drinking 2 gallons in a single hour).

Therefore, our question about MDMA-induced neurotoxicity needs more nuance than ‘is it neurotoxic or not?’ The short answer is yes, but on its own that answer is misleading, because neurotoxicity inevitably depends on the circumstances of administration. The better question to ask is: under what conditions does MDMA cause neurotoxicity?

How does MDMA lead to neurotoxicity?

Before discussing how MDMA leads to neurotoxicity, it may be helpful to review how MDMA works. MDMA acts on several neurotransmitter systems, although primarily releases serotonin from presynaptic nerve terminals, resulting in acute depletion of serotonin stores and synaptic flooding of serotonin. This has led researchers to believe neurotoxicity is a consequence of damage to serotonergic neurocircuitry. Subsequently, evidence of neurotoxicity for MDMA is mainly quantified through 5-HT (serotonin) concentrations, activity or presence of enzymes involved with serotonin synthesis or transport (TH and SERT), and visualization of axons immunoreactive for 5-HT or SERT through imaging.

Mechanisms of MDMA-induced neurotoxicity

Sometimes it may be as simple as the dose determining whether a substance is poisonous. With higher doses there is a greater chance of profound serotonin depletion. Frequency of use is also probably consequential: dosing on a weekly basis (as some do when frequenting raves) does not allow enough time to recover from prior use, even for some people taking doses in clinical environments. Put simply, a serotonin neuron has a capacity for how high concentrations can rise without damage, and a limit to how fast it can return to balance after use. The two factors may go hand in hand, as frequent use can produce tolerance and lead to escalating doses.

Other explanations are more complicated. For example, MDMA is metabolized to reactive metabolites such as HHMA or HHA, which can cause cellular damage. The serotonin excess may also lead to lasting reductions in gene expression, resulting in lowered expression of 5-HT or SERT. The latter hypothesis challenges the ‘neurodegenerative’ theory of MDMA-induced neurotoxicity, which posits direct damage to serotonin neurocircuits, and argues instead for a gene-expression basis for changes in serotonin function.

It is crucial to note that MDMA’s neurotoxic potential is increased by other substances such as alcohol or amphetamine, which are commonly taken in conjunction with MDMA at nightlife events. Concurrent substances can be ingested knowingly or unknowingly, as many Ecstasy tablets contain adulterants or misrepresented substances. For example, one analysis found that almost half of ecstasy tablets contained <67% MDMA, and that caffeine and amphetamine were common adulterants. Other common adulterants include novel psychoactive substances such as other phenethylamine (amphetamine-type) drugs. Many events sell alcohol or may limit water availability. In combination, alcohol and MDMA both exert neurotoxic effects by impairing the survival of neuronal precursors in the hippocampal dentate gyrus, an area of the brain important for neuronal generation, learning, and memory. Beyond alcohol or amphetamines, rave scenes can expose users to myriad other substances, such as GHB (gamma-hydroxybutyrate) or ketamine, which may also increase risk. Recently and tragically, adulterants from other drug classes, such as fentanyl, have begun to be found in ecstasy tablets. Harm reduction organizations recommend testing ecstasy for the presence of the desired agent (MDMA) and the absence of deadly adulterants (fentanyl). Further information and testing kits for purchase can be found at dancesafe.org

Other aspects of MDMA use that may contribute to neurotoxicity are the recreational environment and the timing of administration. Users may stay up all night on MDMA and related stimulants, triggering a sleep-deprived state. Sleep deprivation is well known to impair cognition and lower mood, and thus plausibly accounts for some of the observed week-after effects. This issue may be minimized by timing MDMA administration so that the user can fall asleep at a regular time, or with minimal disruption to habitual circadian patterns. Another factor that may play a role in toxicity is bioenergetic stress (namely thermal stress) from MDMA being used in hot environments, raising core body temperature, and being associated with high-output physical activity.

Though the jury is still out on exactly how MDMA exerts its neurotoxic effects, the bottom line is that there are several plausible explanations for why MDMA could be neurotoxic, as well as evidence from recreational use environments that it can be neurotoxic under certain circumstances. These circumstances tend to work in conjunction within subcultures that use Ecstasy heavily.

At what point does MDMA become neurotoxic?

The potential for MDMA to act as a neurotoxin is determined by the dose, frequency of exposure, timeline of exposure, concurrent substance use, and overall setting or context in which it is administered.

Doses of 1.7mg/kg (~125mg in a ~160lb adult) have been evaluated under clinical laboratory conditions without evidence of damage to serotonin neurocircuits or functional deficits. In the six phase II trials of MDMA-assisted psychotherapy (MDMA-AP) for PTSD conducted thus far, doses ranged between 75 and 187.5mg, and participants underwent 2-3 experimental sessions spaced 3-5 weeks apart. In phase III trials, the maximum dose in any session will be 120mg as an initial dose plus 60mg as a booster (180mg total), and all participants will undergo 3 sessions. In phase II, no changes in neurocognitive function were detected over a 2-month period following the second and third experimental sessions. During MDMA-AP sessions, participants reported a range of short-lived side effects characteristic of amphetamine use and mild serotonin excess. In the week after use, fatigue peaked on day 1 and decreased toward day 6, the need for more sleep peaked on day 2 and decreased toward day 6, and low mood peaked between days 2-4 and decreased toward day 6. Notably, in 4 of the 6 phase II studies, depression scores were measured and showed greater improvements than placebo over the course of the trial. In summary, it is clear that a course of 2-3 MDMA sessions, spaced at least a month apart, using doses of 75-125mg followed by a booster of 37.5-75mg 2-3 hours later, does not produce persistent neurotoxic effects.
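The per-weight figure above is simple arithmetic: convert pounds to kilograms, then multiply by the mg/kg dose. A minimal sketch (the function name and constant are mine, for illustration only):

```python
LB_PER_KG = 2.20462  # pounds per kilogram

def dose_mg(dose_mg_per_kg: float, weight_lb: float) -> float:
    """Total dose in mg for a given mg/kg dose and body weight in pounds."""
    weight_kg = weight_lb / LB_PER_KG
    return dose_mg_per_kg * weight_kg

# 1.7 mg/kg for a ~160 lb adult works out to ~123 mg,
# consistent with the ~125 mg figure quoted above.
print(round(dose_mg(1.7, 160)))  # → 123
```

This also makes clear why the "~" matters: small differences in body weight shift the total dose by several milligrams.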

In the previously mentioned analysis of ecstasy samples, tablets containing >67% MDMA averaged 205mg per dose, higher than the initial and booster doses in MDMA-AP combined (however, tablet weights include fillers and are thus likely an overestimate of MDMA content). Consumption of multiple tablets (stacking) can bring doses up to 0.5g or more over an evening for some users, similar to the amount consumed across all 3 MDMA-AP sessions combined. Stacking was investigated under laboratory conditions by giving 100mg of MDMA followed by another 100mg 4 hours later. Blood concentrations, cardiovascular stress, and temperature all increased, yet the subjective intensity of effect was similar, owing to rapid tolerance to the pleasurable effects. This may cause users to underestimate the physical stress they are placing on their bodies with repeated dosing.

So, what does the research tell us so far?

There are two converging streams of evidence here: one from the recreational literature on Ecstasy use, suggesting potential for neurotoxicity; the other from clinical environments, in which MDMA is predominantly therapeutic, has a promising safety profile, and lacks evidence of clinically significant neurotoxicity despite the use of measures that could reasonably detect such problems. Therefore, dosing models from clinical trials could be extrapolated into a harm reduction framework applicable broadly to persons who use MDMA.

While phase II trials did detect several cases of persons experiencing side effects of MDMA in the week after their session, the changes were transient and resolved within a week in almost all cases. This suggests that transient changes to mood, sleep, or cognition in the week after use are side effects that people should be educated about, while neurotoxicity is an adverse drug reaction associated with ‘incorrect’ or harmful administration patterns.

How can MDMA-induced neurotoxicity be avoided?

The good news is that avoiding, or severely limiting, the potential for significant MDMA neurotoxicity is straightforward and preserves the possibility of profound benefits.

- Limit initial MDMA doses to the moderate range (75-125mg)
- Limit additional dosing in a single session to one booster of 50% of the initial dose
- Space MDMA sessions at least one month apart
- Limit dosing to 3-4 times per calendar year
- Avoid using other drugs and alcohol with MDMA
- Test ‘ecstasy’ tablets for presence of MDMA prior to use (as well as absence of other drugs such as fentanyl)
- Limit sleep disruption due to MDMA use
- Plan regular breaks and sip water if exposed to hot environments (do not drink excessive amounts of water)

If you notice a pattern of low mood or cognitive problems in the days after MDMA use that worsens with repeated administration, or that persists beyond a few days post-use, take an extended break from MDMA and similar substances.
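The quantitative guidelines in the list above (moderate initial dose, booster capped at half the initial dose, at least a month between sessions, 3-4 sessions per year) can be encoded as a simple checklist. The sketch below is illustrative only; the function name, constants, and thresholds are my reading of the bullets above, not a clinical tool:

```python
from datetime import date, timedelta

MIN_SPACING = timedelta(days=30)   # space sessions at least one month apart
MAX_PER_YEAR = 4                   # limit to 3-4 sessions per calendar year
INITIAL_RANGE_MG = (75, 125)       # moderate initial dose range

def within_guidelines(initial_mg: float, booster_mg: float,
                      session_dates: list[date]) -> bool:
    """Check a planned dosing pattern against the guidelines listed above."""
    lo, hi = INITIAL_RANGE_MG
    if not lo <= initial_mg <= hi:
        return False
    if booster_mg > 0.5 * initial_mg:  # booster capped at 50% of initial dose
        return False
    dates = sorted(session_dates)
    if any(b - a < MIN_SPACING for a, b in zip(dates, dates[1:])):
        return False
    for year in {d.year for d in dates}:
        if sum(d.year == year for d in dates) > MAX_PER_YEAR:
            return False
    return True

sessions = [date(2020, 1, 10), date(2020, 3, 14), date(2020, 6, 20)]
print(within_guidelines(100, 50, sessions))  # → True
```

Note that this captures only the dose and timing bullets; the other items (avoiding co-ingested drugs, testing tablets, protecting sleep, hydration) are contextual and cannot be reduced to numbers.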

Do antidepressants or 5-HTP prevent MDMA-induced neurotoxicity?

Serotonin-blocking antidepressants such as selective serotonin reuptake inhibitors (SSRIs, e.g. Prozac/fluoxetine) can diminish the neurotoxic potential of MDMA. However, this is a poor strategy overall, because these antidepressants also greatly diminish the subjective experience of MDMA.

Supplements such as L-tryptophan or 5-HTP can boost serotonin synthesis. It may not be wise to take large doses prior to MDMA use, as this could increase the risk of the physical side effects of serotonin excess, such as nausea, vomiting, or diarrhea. Conversely, it may be reasonable to use L-tryptophan or 5-HTP for a few days after MDMA use to aid in restoring serotonin levels.

Neither antidepressants nor serotonin precursor supplementation has adequate data to recommend its use. While further research into interventions that reduce week-after side effects or prevent neurotoxic effects is encouraged, current clinical trial data do not support such measures as necessary.

Summary & conclusions

Neurotoxicity can manifest in many forms and severities, and MDMA appears to be neurotoxic under certain circumstances. High or repeated doses within a single session, frequent use (e.g. weekly use at nightlife events), adulteration, and substance mixing are all implicated in MDMA neurotoxicity. Data from phase II trials of MDMA-AP do not suggest clinically significant neurotoxicity: side effects were mild and limited to the week after use, while participants saw profound improvements in clinical symptoms of PTSD. In summary, MDMA-induced neurotoxicity is straightforward to avoid with moderate doses, adequate time between uses, and pure, uncombined substances.



Moderator: DiTM
Staff member
Aug 16, 2019
your dad's house
Glad to see some real pharmacists weighing in on this rather than just us bozos. Not that experts are infallible, but surely they are well versed in drug mechanisms.

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

The effects of psychedelics on the brain's 'consciousness conductor'

by Rich Haridy | New Atlas | 14 Jun 2020

New research reveals psilocybin seems to reduce neural activity in a brain region called the claustrum.

A new Johns Hopkins study, looking at how psilocybin influences a mysterious brain region called the claustrum, is just one of several compelling recent articles shining a light on how our brains generate our experience of consciousness.

In 2004, Francis Crick, one of the 20th century’s greatest scientific minds, died of colon cancer. Crick was best known for describing the structure of DNA in the 1950s with collaborator James Watson, but over the last couple of decades of his life his research focused on perhaps the biggest scientific question of them all: how does our brain generate what we consider to be consciousness?

The last paper Crick ever penned homed in on a small and still relatively mysterious brain region called the claustrum. Co-authored with Christof Koch, Crick was reportedly still editing the manuscript in hospital the day he died. Subsequently published in 2005, the paper presented a novel hypothesis - the claustrum may be key to our experience of consciousness, unifying and co-ordinating disparate brain areas to help generate our singular experience.

“The claustrum is a thin, irregular, sheet-like neuronal structure hidden beneath the inner surface of the neocortex in the general region of the insula,” wrote Crick and Koch in the landmark paper. “Its function is enigmatic. Its anatomy is quite remarkable in that it receives input from almost all regions of cortex and projects back to almost all regions of cortex.”

The extraordinarily unique way the claustrum connects different brain regions fascinated Crick. While some researchers had previously suggested the claustrum could potentially be the brain’s epicenter of consciousness, Crick and Koch presented a different analogy to describe the role of this mysterious brain region.

“We think that a more appropriate analogy for the claustrum is that of a conductor coordinating a group of players in the orchestra, the various cortical regions,” the pair wrote. “Without the conductor, the players can still play but they fall increasingly out of synchrony with each other. The result is a cacophony of sounds.”

It's like a highway

A new study, published in the journal Current Biology, is describing in unprecedented detail how the claustrum communicates with other brain regions. The project, an international collaboration between researchers in Sweden and Singapore, somewhat backs up Crick’s "consciousness conductor" hypothesis, revealing the claustrum is less like a singular hub for cortical inputs and more like a collection of specialized synaptic pathways connecting specific cortical regions.

“We found that the synaptic connectivity between the cortex and claustrum is in fact organized into functional connectivity modules, much like the European route E4 highway or the underground system,” says Gilad Silberberg, lead author on the study, from the Karolinska Institutet.

Another recent and even more focused study zoomed in on the claustrum’s role in coordinating slow-wave brain activity. A team from Japan’s RIKEN Center for Brain Science generated a transgenic mouse model in which they could artificially activate neurons in the claustrum through optogenetic light stimulation.

... it is so exciting that we are getting closer to linking specific brain connections and actions with the ultimate puzzle of consciousness. - Yoshihiro Yoshihara

The research discovered slow-wave activity across a number of brain regions increased in tandem with neural firing in the claustrum. Slow-wave brain activity is most often linked to a key period of sleep associated with memory consolidation and synaptic homeostasis.

“We think the claustrum plays a pivotal role in triggering the down states during slow-wave activity, through its widespread inputs to many cortical areas,” says Yoshihiro Yoshihara, team leader on the new RIKEN research. “The claustrum is a coordinator of global slow-wave activity, and it is so exciting that we are getting closer to linking specific brain connections and actions with the ultimate puzzle of consciousness.”

So, if increased claustrum activity seems to orchestrate a kind of synchronized slowing down of brain activity across a number of different cortical regions, what happens when claustrum activity is suppressed?

The claustrum under the influence of psychedelics

One hypothesis has suggested dysfunctional claustrum activity could play a role in the subjective effects of psychedelic drugs. One of the fundamental neurophysiological characteristics of a psychedelic experience is widespread dysregulation of cortical activity. Brain networks that don’t normally communicate will suddenly spark up connections under the influence of psilocybin or LSD. So a team from Johns Hopkins University set out to investigate exactly how psilocybin influences claustrum activity.

Due to the claustrum’s location in the brain its activity has traditionally been quite difficult to study in humans. However, a recently developed functional magnetic resonance imaging (fMRI) technique has afforded researchers a new and detailed way to measure claustrum activity. The Johns Hopkins study recruited 15 subjects to measure claustrum activity after either a placebo or a dose of psilocybin.

The study found psilocybin reduced claustrum neural activity between 15 and 30 percent. The overall reductions in claustrum activity also directly correlated with the subjective psychedelic effects of the drug.

More specifically, psilocybin seemed to significantly alter how the claustrum communicated with a number of brain regions fundamentally involved in attentional tasks and sensory processing. For example, under the influence of psilocybin, functional connectivity between the right claustrum and the auditory and default mode networks significantly decreased, while right claustrum connectivity with the fronto-parietal task control network increased.

“Our findings move us one step closer to understanding mechanisms underlying how psilocybin works in the brain,” says Frederick Barrett, one of the authors on the new study. “This will hopefully enable us to better understand why it’s an effective therapy for certain psychiatric disorders, which might help us tailor therapies to help people more.”

As Barrett suggests, this new insight into the effect psilocybin has on claustrum activity may shine a light on how this psychedelic drug generates its beneficial therapeutic effects. Psilocybin in particular has been found to be significantly useful in treating major depression and substance abuse disorders. The Johns Hopkins scientists hypothesize psilocybin’s action on the claustrum may play a key role in both the subjective effects of this psychedelic drug, and its beneficial therapeutic outcomes.

Further research is certainly necessary to verify this hypothesis, and the next step for the Johns Hopkins team will be to use this new claustrum imaging technique to investigate the brain region in subjects with a variety of psychiatric disorders. Fifteen years on from Francis Crick’s passing his final work is still inspiring new research. The new wave of psychedelic science, in tandem with novel neuroimaging techniques, brings us closer and closer to understanding how our brains create consciousness.

The new study was published in the journal Neuroimage.

Source: Johns Hopkins Medicine


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Image (a) is an untreated brain, while image (b) is a brain on psilocybin.

New insights into psilocybin’s effect on the brain

by Georgia Perry | Lucid News | 27 April 2020

Scientists get closer to understanding why psychedelics show promise for treating mental illness.

In an effort to explore the effect of psilocybin on the healthy brain, an international team of scientists created a biophysically realistic whole-brain model, described by the researchers as a “technical tour de force.” The researchers reported on this model in a paper published in the Proceedings of the National Academy of Sciences.

The groundbreaking model enabled them to observe how psilocybin impacts the activity of neurons and neurotransmitters. “Longer term, this could provide a better understanding of why psilocybin is showing considerable promise as a therapeutic intervention for neuropsychiatric disorders including depression, anxiety, and addiction,” they wrote in the paper.

Psilocybin and other psychedelics are known to affect the neurotransmitter balance of serotonin receptors in the brain, but up to this point “little has been known of this process,” write two of the paper’s authors, Morten Kringelbach and Gustavo Deco, in an email to Lucid News. The model they created sheds new light on these dynamics. “Using this model will be crucial for truly understanding how psilocybin can rebalance neuropsychiatric disorders such as treatment-resistant depression and addiction,” the researchers added.

To create the whole-brain model, they analyzed functional magnetic resonance imaging (fMRI) data from 16 healthy subjects. Then, nine subjects each underwent two fMRI scans in separate sessions, in which they were given either a 2mg dose of psilocybin or a placebo saline solution. "The findings reveal that when psilocybin was introduced, neural networks were disrupted and neurotransmitters forged new pathways between neurons,” writes Mental Daily.

“It has long been a puzzle how the brain’s fixed anatomical connectome can give rise to so many radically different brain states; from normal wakefulness to deep sleep and altered psychedelic states,” write Kringelbach and Deco. The whole-brain model they created is capable of addressing this puzzle, in addition to advancing scientific understanding of psilocybin.
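Whole-brain models of this kind are typically built from regional oscillators coupled through the anatomical connectome. The following is a rough illustrative sketch of that general approach only, using a Hopf (Stuart-Landau) oscillator per region with a toy connectome and made-up parameters; it is not the authors' actual model or code:

```python
import numpy as np

def simulate_hopf(C, omega, a=-0.02, G=0.5, dt=0.01, steps=5000, sigma=0.02, seed=0):
    """Minimal coupled-oscillator whole-brain sketch (Euler-Maruyama integration).

    C     : (N, N) structural coupling matrix (toy stand-in for a connectome)
    omega : (N,) intrinsic angular frequency of each region
    a     : bifurcation parameter (a < 0 -> noisy fluctuations around rest)
    G     : global coupling strength
    """
    rng = np.random.default_rng(seed)
    N = len(omega)
    z = 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    out = np.empty((steps, N))
    for t in range(steps):
        local = (a + 1j * omega) * z - np.abs(z) ** 2 * z          # Hopf dynamics
        coupling = G * (C @ z - C.sum(axis=1) * z)                 # diffusive coupling
        noise = sigma * np.sqrt(dt) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        z = z + dt * (local + coupling) + noise
        out[t] = z.real                                            # BOLD-like signal proxy
    return out

# Toy example: 4 regions with a random symmetric "connectome"
rng = np.random.default_rng(1)
C = rng.random((4, 4)); C = (C + C.T) / 2; np.fill_diagonal(C, 0)
sig = simulate_hopf(C, omega=2 * np.pi * np.array([0.04, 0.05, 0.06, 0.05]))
print(sig.shape)  # (5000, 4)
```

Fitting such a model to empirical fMRI means tuning parameters like `a` and `G` until the simulated signals reproduce the measured functional connectivity; a drug effect can then be modeled as a change in those parameters.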

"The new model," they write, “will give us the much needed, causal tools for potentially designing new interventions to alleviate human suffering in neuropsychiatric disorders.”

They are currently using the model in a new study of psilocybin for depression, conducted by Dr. Robin Carhart-Harris, who was also involved in this study.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Psychedelic drugs affect how the brain is wired

by David Olson | The Conversation | The Daily Beast

Researchers know that mind-altering drugs including LSD, DMT, and MDMA affect brain function, but new findings show they can alter the structure of the brain as well.

It seems that psychedelics do more than simply alter perception. According to the latest research from my colleagues and me, they change the structures of neurons themselves.

My research group has been studying the effects of psychedelics on neuronal structure and function, and we found that these compounds cause neurons to grow. A lot. Many of these compounds are well-known and include lysergic acid diethylamide (LSD), psilocin (from magic mushrooms), N,N-dimethyltryptamine (DMT, from ayahuasca) and 3,4-methylenedioxymethamphetamine (MDMA, aka ecstasy).

These are among the most powerful drugs known to affect brain function, and our research shows that they can alter the structure of the brain as well. Changes in neuronal structure are important because they can impact how the brain is wired, and consequently, how we feel, think and behave.

Prior to our study, few compounds were known to have such drastic and rapid effects on neuronal structure. One of those was ketamine—a dissociative anesthetic and quite possibly the best fast-acting antidepressant that we have available to us at the moment.

If you think of a neuron like a tree, then its dendrites would be the large branches, and its dendritic spines—which receive signals from other neurons—would be the small branches. Some of these small branches might have leaves, or synapses in the case of a neuron. In fact, neuroscientists often use terms like “arbor” and “pruning” much like a horticulturist would.

“When we grew neurons in a dish—which is not unlike growing a plant in a pot—and fed them psychedelic compounds, the neurons sprouted more dendritic branches, grew more dendritic spines, and formed more connections with neighboring neurons.”

Thanks to studies on ketamine, slow-acting antidepressants and chronic stress models of depression, scientists now know that depression is not simply the result of a “chemical imbalance,” as pharmaceutical companies like to suggest. It is far more complicated and involves structural changes in key neural circuits that regulate emotion, anxiety, memory and reward.

Rethinking depression

One of the hallmarks of depression is the atrophy of neurons in the prefrontal cortex—a region of the brain that controls anxiety and regulates mood among other things. Basically, these branches and spines shrivel up, disconnecting from other neurons in the brain. One hypothesis for why ketamine is so effective is because it can rapidly regrow the arbors and spines of these critical neurons.

Like ketamine, psychedelics have shown promise in the clinic for treating neuropsychiatric diseases. The DMT-containing herbal tea known as ayahuasca produces fast-acting antidepressant effects within a day, psilocybin eases the anxiety of terminally ill cancer patients and MDMA can reduce fear in those suffering from post-traumatic stress disorder (PTSD). Our recent papers suggest the intriguing possibility that psychedelic compounds and ketamine might share a common therapeutic mechanism.

Strictly speaking, a psychedelic is a “mind-manifesting” drug—a definition that’s open to interpretation. They tend to produce perceptual distortions or hallucinations by activating 5-HT2A receptors. Our research group has found that compounds typically regarded as psychedelics, like LSD and DMT, as well as those that are sometimes called psychedelics, like MDMA, and those that are not usually called psychedelics, like ketamine, are all capable of profoundly impacting neuronal structure.

Psychedelics vs. Psychoplastogens

Our group has coined the term “psychoplastogen” to refer to such compounds, and we believe that these molecules may hold the key to treating a wide variety of brain diseases.

Our studies on neurons grown in dishes, as well as experiments performed using fruit flies and rodents, have demonstrated that several psychoplastogens, including psychedelics and ketamine, encourage neurons to grow more branches and spines. It seems that all of these compounds work by activating mTOR—a key protein involved in cell growth.

The biochemical machinery that regulates mTOR activity is intricate. As we tease apart how psychedelics and other psychoplastogens turn on mTOR signaling, we might be able to engineer compounds that only produce the therapeutic effects on neuronal growth while bypassing pathways that lead to undesired hallucinations.

The field has known for some time now that psychedelics can produce lasting positive effects on brain function, and it’s possible that these long-lasting changes result from the psychoplastogenic effects of these drugs. If true, this would suggest that psychoplastogens might be used to repair circuits that are damaged in mood and anxiety disorders.

Many diseases, such as depression and anxiety disorders, are characterized by atrophy of dendritic branches and spines. Therefore, compounds capable of rapidly promoting dendritic growth, like psychedelics, have broad therapeutic potential. The number of papers demonstrating that psychedelics can produce therapeutic effects continues to grow every year.

Panacea or poison?

However, we should temper our enthusiasm because we do not yet know all of the risks associated with using these drugs. For example, it’s possible that promoting neuronal growth during development could have negative consequences by interfering with the normal processes by which neural circuits are refined. We just don’t know, yet.

Similarly, it is unclear what effects psychoplastogens will have on the aging brain. It’s important to keep in mind that excessive mTOR activation is also associated with a number of diseases including autism spectrum disorder (ASD) and Alzheimer’s disease.

To me, it’s obvious that we need to understand how these powerful compounds affect the brain, in both positive and negative ways, if we hope to fully comprehend the fundamental laws governing how the nervous system works and how to fix it when it doesn’t.

By David E. Olson, Assistant Professor, Department of Chemistry; Department of Biochemistry & Molecular Medicine; Center for Neuroscience, University of California, Davis


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Examining the cognitive neuroscience of psychedelics

by Joel Ng, MA | Psychedelic Science Review | 4 Aug 2020

Much research has been conducted into the use of psychedelics in conjunction with therapy as novel treatments for a host of mental disorders, from OCD (obsessive-compulsive disorder) to depression to substance abuse. Such research has only just begun to regain popularity after the widespread ban on psychedelic substances in the 1970s. However, less is known about how psychedelics work at a granular level. A deeper understanding of psychedelics, and the ability to closely tie the neurochemical changes they cause to subjective experiences, could expand our understanding of the brain and greatly advance mental health care.

A recent paper by Drs. Robin Carhart-Harris and Karl Friston, published in the journal Pharmacological Reviews, suggests a new way of explaining how psychedelics affect the brain’s way of understanding one’s environment and, by extension, provides a potential explanation for how psychedelics work in treating mental disorders. Their model is known by the acronym REBUS, which stands for “relaxed beliefs under psychedelics.”

The REBUS model and prior beliefs

The authors view the brain as an engine that generates mental models of the world with the purpose of predicting future sensory data. These predictions are called ‘priors’, meaning ‘prior beliefs’. Much of human behaviour and cognition are based on these priors. For example, there is the mental model of how a washroom sink functions – turn this knob, water comes out. This can be extended beyond beliefs about the physical nature of the world, also including more abstract beliefs. For example, drug addiction could be viewed through the lens of a prior: “taking this drug leads to large amounts of reward.”

Usually, priors that are inaccurate reflections of the world are updated by incoming sensory data. New information from the senses updates the model to better reflect reality. Sometimes, however, new sensory information is ignored by the brain, due to priors being too rigid to be updated. An example of this is drug addicts who keep abusing their drug of choice even after the negative aspects of drug addiction cause significant damage to their lives or depressed individuals with otherwise comfortable lives. Carhart-Harris and Friston suggest that psychedelics decrease the rigidity of priors – making priors more malleable to incoming sensory data. Likening this process to heating a metal to increase its plasticity, psychedelics allow new information to better mould the prior into something more reflective of reality. This model explains how psychedelic trips can often result in long-term changes in individuals even after the trip ends by promoting a reorganization of the brain’s way of perceiving the world.
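The "relaxed priors" idea can be made concrete with a toy precision-weighted Gaussian belief update (illustrative numbers of my own, not from the paper): a high-precision (rigid) prior barely moves when new evidence arrives, while the same prior with relaxed precision shifts much further toward the data.

```python
def update_belief(prior_mean, prior_precision, obs_mean, obs_precision):
    """Precision-weighted Gaussian belief update (conjugate Bayes rule)."""
    post_precision = prior_precision + obs_precision
    post_mean = (prior_precision * prior_mean + obs_precision * obs_mean) / post_precision
    return post_mean, post_precision

# A rigid negative prior (say, -1.0) meets positive evidence (+1.0).
rigid_mean, _ = update_belief(-1.0, prior_precision=10.0, obs_mean=1.0, obs_precision=1.0)
relaxed_mean, _ = update_belief(-1.0, prior_precision=1.0, obs_mean=1.0, obs_precision=1.0)
print(round(rigid_mean, 3))    # -0.818  (the belief barely moves)
print(round(relaxed_mean, 3))  # 0.0     (the belief shifts halfway to the evidence)
```

In REBUS terms, the hypothesized psychedelic effect corresponds to lowering `prior_precision`, so incoming sensory data carries more relative weight in the update.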

What does the dissolution of priors feel like on the individual level? One of the most widely accepted models of personality, the five-factor model, describes five domains of personality: openness, neuroticism, extraversion, agreeableness, and conscientiousness. Openness, the factor of interest here, broadly describes an individual’s affinity for new experiences, people, and viewpoints. Personality analyses of subjects who ingested psilocybin showed significant increases in the openness domain more than a year after the psilocybin dose. Viewed broadly, this dissolution of priors feels like an imposed openness of mind, in which the psychedelic compound forces one to be more receptive to incoming information. As information about the world enters consciousness – information that might otherwise have been ignored to preserve prior cognitive structures – the psychedelic mind-state is less able to discount it, which subjectively feels like being more open-minded.

Learning from the brains of children

Psychedelics’ action on the brain’s serotonin 5-HT2 receptors has been theorized as the main neurochemical process by which they alter subjective experience. Interestingly, psychedelics induce certain biological events that resemble those of childhood. First, serotonin receptors, the primary receptors that psychedelics act on, are more numerous in children than in adults. Second, neurogenesis and brain plasticity, both traits that are more pronounced in childhood brains, are induced by psychedelics. All in all, the evidence indicates that psychedelic states of mind and childhood states of mind bear great similarities. Learning about the world and the individual’s relationship with it takes place during a critical period in childhood, one that psychedelics might be able to reactivate in adults.

With this implication in mind, one can examine further the interesting mental abilities of children, abilities that fade away as they grow up. For example, enhanced learning potential and enhanced memory are just two of the things that children are capable of that adults are not. Children with an eidetic memory often lose this ability as they grow. This is an ability that has, anecdotally, been retriggered in psychedelic mind-states. However, more rigorous research is required to corroborate these accounts.

Other advantages of the REBUS model

One additional benefit of the REBUS model of psychedelic effects on cognition is that it can explain a host of other subjective effects that psychedelics produce, from ego dissolution and altered time perception to geometric hallucinations and magical thinking. All of these subjective effects can be seen as the result of new information altering previously rigid cognitive structures such as the ego, the subjective experience of time, the visual recognition and classification of objects, and large-scale worldviews on the nature of reality.

Lastly, and perhaps most importantly, this model allows the psychiatric community to better explain exactly why and how psychedelics seem to be so effective at treating certain types of mental illness. Conditions such as anxiety, depression, and PTSD (post-traumatic stress disorder) can be viewed as maladaptive priors that are immune to sensory updates in sober mind-states but can be integrated with new information in the psychedelic mind-state, thanks to the reduced rigidity of the priors. A depressed individual remaining depressed after positive events in their life can be viewed as the prior “I am not worth anything” resisting integration with the new information. This resistance is reduced in the psychedelic mind-state, allowing psychedelic-enhanced psychotherapy to be more effective than regular psychotherapy.

The impact on psychedelic therapy

In conclusion, authors Carhart-Harris and Friston posit a new model of viewing the brain, one that also explains the subjective and neurochemical effects of psychedelics on the brain. This model sheds more light on many facets of psychedelics, from the recreational, subjective effects of psychedelics to more serious medical effects that might one day pave the way to greater, more effective mental health treatments for a variety of individuals.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Psychedelics and the Default Mode Network

by Jasmine Virdi | Psychedelics Today | 4 Feb 2020

Modern neuroscience has demonstrated that psychedelics such as LSD, psilocybin (the active ingredient in magic mushrooms), and ayahuasca significantly reduce activity in the brain’s default mode network (DMN). This reduction in DMN activity functions as a kind of ‘rebooting’ of the brain, and is thought to be linked to one of the most enduring therapeutic effects of psychedelic substances.

What is the Default Mode Network?

The default mode network refers to an interconnected group of brain regions associated with introspective functions and internally directed thought, such as self-reflection and self-criticism. Increased activity of the DMN is correlated with the experience of mind-wandering and our capacity to imagine mental states in others (i.e., theory of mind), as well as our ability to mentally “time travel,” projecting ourselves into the past or future.

The functioning of the DMN is considered essential to normal, everyday consciousness and is at its most active when a person is in a resting state and their attention is not externally directed on a worldly task or stimulus. For example, if you put somebody in an MRI scanner and don’t give them anything to do, their mind will start wandering and you will see the regions that make up the DMN light up.

The functional connections that make up the DMN increase from birth to adulthood, with the DMN not being fully active until later in a child’s development, emerging around the age of five as the child develops a stable sense of narrative self or “ego.”

As we mature, we learn to respond to life’s stimuli in a patterned way, developing habitual pathways of communication between brain regions, particularly those of the DMN. Over time, communication becomes confined to specific pathways, meaning that our brain becomes more ‘constrained’ as we develop. It is these constrained paths of communication between brain regions that quite literally come to constitute our ‘default mode’ of operating in the world, coloring the way we perceive reality.

Evolutionarily speaking, it has been hypothesized that the DMN plays a major role in our survival, helping us form a continuous sense of self and differentiate ourselves from the world around us. The DMN has been described by psychiatrist Matthew Brown as the part of the brain that serves to “remind you that you are you.”

Overactivity of the DMN and mental health conditions

The DMN has been found to be particularly overactive in certain mental health conditions, such as depression, anxiety, and OCD. Matthew Brown likens DMN overactivity to experiences of “hypercriticality,” “rigid thought patterns," and “automatic negative thought loops” about oneself.

Imagine that you are at a party, telling a joke that gets met with an awkward silence. Initially, people might think “Oh no, that wasn’t so funny,” but they tend to quickly move on to the next leg of the conversation, forgetting about it entirely. However, you go home that evening, finding yourself completely unable to sleep because you are wrought with worry about the bad joke you told, what a fool you appeared to be, and how others might be judging you harshly for it. This is a classic example of DMN overactivity and the negative thought patterns which tend to be visible in people who suffer from depression, anxiety, and OCD.

How do psychedelics affect the Default Mode Network?

Psychiatric doctor and ayahuasca researcher Simon Ruffell likens the effects of psychedelics on the DMN to “defragmenting a computer.” When you ingest a psychedelic, activity of the DMN is significantly decreased whilst connectivity in the rest of the brain increases.​
“Brain imaging studies suggest that when psychedelics are absorbed they decrease activity in the default mode network. As a result the sense of self appears to temporarily shut down, and thus ruminations may decrease. The brain states observed show similarities to deep meditative states, in which increased activity occurs in pathways that do not normally communicate. This process has been compared to defragmenting a computer. Following this, it appears that the default mode network becomes more cohesive. We think this could be one of the reasons levels of anxiety and depression appear to reduce.”
Dr. Simon Ruffell, Psychiatrist and Senior Research Associate at King’s College London​

Due to psychedelics’ ability to disrupt the activity of the DMN, they have a particularly strong therapeutic potential when it comes to changing negative thought patterns. For example, a study by Imperial College London assessed the impact of psilocybin-assisted therapy on twelve patients with severe depression. Results demonstrated that psilocybin-assisted therapy was able to dramatically reduce their depression scores for a period of up to three months.

A follow-up study suggested that the therapeutic impact of psilocybin was linked to its ability to ‘reset’ the DMN, turning it off and reconsolidating it in a way that is a little less rigid than before.

In general, it has been shown that psychedelics produce increases in psychological flexibility, positing another explanation for why we see decreases in depression and anxiety following a psychedelic experience. Based on what we know about the DMN, we could hypothesize that it plays an influential role in one’s ability to be psychologically flexible.

Matthew Brown gave an analogy for how psychedelics are able to reset the DMN, enabling an increased sense of psychological flexibility:​
“If you do the same thing repeatedly, it is like you are walking down the same path all the time. Naturally, that path becomes very well worn and easy to walk down. However, you realize that maybe there is another path that might be more advantageous for you and you want to try walking down that path. Psychedelics ‘mow the lawn’ so that it doesn’t seem that the weeds are quite so high and you can walk down that new path a little bit more easily.”

Entropic brain theory and the 'reducing valve'

Psychedelics tend to disrupt the activity of the DMN, temporarily disintegrating the highly organized system of networks that it is made up of, allowing for “less ordered neurodynamics”, and a greater degree of entropy within the brain. That is to say that open, freer conversations begin to take place between brain regions that are normally kept separate.
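"Entropy" here has a precise meaning: the Shannon entropy of the distribution over brain states, which rises as the repertoire of visited states becomes more uniform. A toy calculation with hypothetical state probabilities (purely for illustration, not data from any study):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Ordinary waking: activity concentrated in a few habitual network states.
constrained = [0.70, 0.20, 0.05, 0.05]
# Hypothesized psychedelic state: the repertoire flattens; more states are visited.
expanded = [0.25, 0.25, 0.25, 0.25]

print(round(shannon_entropy(constrained), 3))  # 1.257 bits
print(round(shannon_entropy(expanded), 3))     # 2.0 bits (the maximum for 4 states)
```

The entropic brain hypothesis is essentially the claim that psychedelics push the distribution from something like `constrained` toward something like `expanded`.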

According to the ‘entropic brain’ theory, the state of consciousness associated with psychedelics is comparable to that which exists in early childhood – we experience awe and wonder, looking at everything in the world around us as wholly novel.

These findings are in line with writer and philosopher Aldous Huxley’s early reflections on the psychedelic experience, in which he described psychedelic consciousness as “Mind at Large” in that it grants us access to a larger set of brain functions, allowing us to tap into an unbounded state of consciousness which extends beyond the individual and into the collective. He theorized that in order “to make biological survival possible, Mind at Large has to be funneled through the reducing valve of the brain and nervous system.”

In this case, we can think of the “reducing valve” as a metaphor for the DMN which in some sense serves “to protect us from being overwhelmed and confused by this mass of largely useless and irrelevant knowledge, […] and leaving only that very small and special selection which is likely to be practically useful.”

The Default Mode Network and ego death

In 2016, a breakthrough study by Imperial College London used a combination of neuroimaging techniques to measure electrical activity and experiential reports from participants to investigate the link between brain activity and reported psychological responses to LSD in twenty volunteers.

Results demonstrated that LSD dampens the function of the DMN, and that this decrease in activity strongly correlated with the subjective experience of “ego dissolution” or “ego death”, indicating that the DMN performs a vital part in sustaining the “ego” or “self.”

Similarly, researchers at Johns Hopkins University published a pioneering study, demonstrating that psilocybin is able to produce mystical-type experiences in participants, such as the experience of ego death. These experiences were considered to be deeply meaningful by participants and were seen to elicit sustained positive changes in attitude and behaviour.

Generally, it’s our ego – our sense of “I” – that tends to create and harbor negative thought patterns. In conditions such as depression and anxiety, we become self-absorbed, narrowly focused on thoughts about ourselves, unable to take a step back and see the bigger picture. The ego erects boundaries that can lead to us feeling isolated from the people around us, disconnected from nature and even ourselves.

In a state of ego dissolution, these boundaries are let down and a great “zooming out” takes place where you begin to see things on a macroscopic level. You are no longer an individual isolated from life as it takes place around you, but rather you are interconnected with everything through the web of life. It is not a logical, but rather a felt experience of incredible love and reconnection.

When asked about the therapeutic implications of having an experience like ego dissolution, Matthew Brown explained that it can be tremendously healing as our consciousness is able to extend itself beyond the confines of our individual experience, and become one with nature’s larger whole.

“You realize that you are extremely insignificant, and perhaps that sounds defeating. However, it can be very freeing to realize that you are just one human who is existing for a very small blip of time in the grand scheme of the universe.” — Dr. Matthew Brown, DO, MBA, ABPN, Child, Adolescent, Adult Psychiatry

It is important to note that although experiences of ego death can lead to deep personal insight, and thus have therapeutic benefits, they can also be terrifying. Author of Changing our Minds, Don Lattin reminds us that ego death can be a “fearful and/or enlightening experience” that “depends in large part on whether mind travelers are ready for the journey, what baggage they bring along, and who’s accompanying them.”

Perhaps what is most interesting about the ego death experience, and the temporary rewiring of the brain enabled by psychedelics, is the long-lasting, enduring therapeutic effects that remain beyond the temporality of the drug. The resetting of the DMN combined with the powerful experience of ego death induced by psychedelics are often described as amongst the most meaningful of experiences in a person’s life. Such experiences help us to break free from negative thought patterns, become more psychologically flexible as well as dissolve the barriers between ourselves and the world around us, realizing our place in the interconnected web of life.

Jasmine Virdi is a freelance writer, editor, and proofreader. She currently works for the fiercely independent publishing company Synergetic Press, where her passions for ecology, ethnobotany and psychoactive substances converge. Jasmine’s goal as an advocate for psychoactive substances is to raise awareness of the socio-historical context in which these substances emerged in order to help integrate them into our modern-day lives in a safe, grounded and meaningful way.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

The role of the claustrum in the psychedelic experience

by Shane O'Connor, MS | Psychedelic Science Review | 19 Aug 2020

Brain scans show that psilocybin modulates claustrum connectivity in brain networks.

The claustrum is one of the most enigmatic structures in the brain. This thin sheet of subcortical neurons is noteworthy in that it is connected with almost all cortical areas, including motor, somatosensory, visual, auditory, limbic, associative, and prefrontal cortices. Additionally, it receives neuromodulatory input from subcortical structures.

Fifteen years ago, Sir Francis Crick (co-discoverer of the double helix structure of DNA) and Christof Koch, Chief Scientist at the Allen Institute for Brain Science, published an influential review making an argument for the claustrum as the ‘seat of consciousness’, driving a renewal of interest in the brain structure. In this paper, the pair suggested a function for the claustrum in binding information to generate the conscious experience.

Brain networks and the claustrum

Due to the nature of its connectivity and neuromodulatory input, the claustrum aids in the differentiation between task-relevant and task-irrelevant information, allowing the organism to ignore irrelevant information and proceed with goal-oriented behaviour. In particular, the claustrum has been linked to brain networks implicated in attention; the default mode network (DMN) and task-positive networks such as the Central Executive Network (CEN).

Similarly, recent findings suggest that psilocybin alters the integrity of and coupling between large-scale brain networks, including the DMN as well as sensory and executive control networks. This observation has led to the hypothesis that psilocybin may modulate claustrum function in humans.

Coronal plane section of the human brain showing the location of the claustrum on one side.

Researchers believe that psilocybin may acutely decrease activity within the DMN, an arrangement of functional connections in the brain that is responsible for introspection and planning. The downregulation of the DMN by psilocybin may temporarily lead to increased connectivity between brain regions that ordinarily don’t communicate with one another, corresponding to the subjective experience of “ego-dissolution” and the consequent generation of new perspectives and insights.

Task-positive networks also play an essential role in attention and executive function. Shifts in attention and executive function likewise characterise the subjective effects of psychedelic drugs. Psilocybin produces dose-dependent changes in executive function, including impaired associative learning, working memory, and episodic recall.

The observation that psilocybin perturbs the same brain networks that are functionally connected to the claustrum implicates the claustrum as a target of psilocybin. Furthermore, the effects of psilocybin are achieved primarily through its action as a partial agonist of the serotonin 2A (5-HT2A) receptor, and 5-HT2A receptor protein is highly expressed in the claustrum.

In a recent study, Barrett et al. tested the hypothesis that psilocybin disrupts claustrum activity and functional brain connectivity in humans.

Study design

In the Barrett et al. study, 15 participants completed two brain scanning procedures (fMRI), each beginning 90 min after administration of placebo or a moderate dose of psilocybin (10mg/70kg). The timing of scanning procedures corresponded with peak subjective effects of this dose of psilocybin.
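The "10mg/70kg" notation means the dose was scaled to body weight: 10 mg for a 70 kg participant, i.e. roughly 0.14 mg/kg. A quick illustration of the scaling, with a hypothetical participant weight:

```python
def weight_adjusted_dose(ref_dose_mg, ref_weight_kg, weight_kg):
    """Scale a reference dose (e.g. 10 mg per 70 kg) to a participant's body weight."""
    return ref_dose_mg * weight_kg / ref_weight_kg

print(round(weight_adjusted_dose(10, 70, 70), 2))  # 10.0 mg for a 70 kg participant
print(round(weight_adjusted_dose(10, 70, 85), 2))  # 12.14 mg for an 85 kg participant
```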

Immediately after each resting-state scan, participants rated the degree to which they experienced a series of subjective effects during their resting-state scan, which included:

The overall strength of psilocybin-like effects.
Now-ness: the feeling of being in the present moment.
Letting go: the degree to which a person was able to let go of control of the experience.
Equanimity: a felt sense of equipoise and emotional balance.

Psilocybin modulates claustrum connectivity in brain networks

Brain scanning procedures demonstrated that psilocybin modulated the activity of both left and right claustrum during the acute effects of psilocybin, and led to alterations in both left and right claustrum connectivity with brain networks that support sensory and cognitive processes. In particular, psilocybin decreased functional connectivity of the right claustrum with DMN and increased right claustrum connectivity with task-positive networks.

These results corroborate pioneering psychedelic studies that demonstrated reductions in DMN connectivity and increases in the connectivity of task-positive networks following psilocybin administration. However, how this network disruption occurs is unclear. The results of the Barrett et al. study support the idea that the claustrum may act at the circuit level to produce psilocybin-induced disturbances in both the DMN and task-positive networks.

Subjective effects and claustrum function

Furthermore, the Barrett et al. study found that the subjective effects of psilocybin were associated with measures of claustrum activity. The subjective effects of psychedelic drugs are characterised by alterations in attention and executive function. These shifts may manifest in user-reported subjective effects, including difficulty putting the experience into words (ineffability) and the potentially challenging experiences of depersonalisation, confusion, and paranoid delusions.

Given the association of the claustrum with executive and task-based networks and top-down control of action, the authors of the study posit that the claustrum may play a role in subjectively sensed alterations in executive function through modulation of frontal cortical regions with which the claustrum connects.

Conclusions and context

In conclusion, the study by Barrett et al. supports a possible role of the claustrum in the subjective effects of psilocybin. Moreover, the results suggest a potential mechanism for brain network alterations observed in the psychedelic state by way of aberrant claustrum activity.

Given the network disturbances that underlie neuropsychiatric disorders, including mood and substance use disorders, the broad connectivity of the claustrum indicates this nucleus may play a role in those disease states—disease states in which psilocybin has proven to exert therapeutic relief. The current study underlines the need for further efforts to examine the potential role of the claustrum in therapeutic effects of psilocybin.

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

The neuroscience of psychedelic visuals

by Dr. James Cooke | Reality Sandwich | 30 Aug 2020

‘Psychedelic’ can be applied not only to a class of chemical, but to a whole style of visual art, inspired by the visions that these chemicals produce. Why do these substances produce similar visual experiences across different individuals and where do these experiences come from? From breathing walls to melting hands, from symbolic imagery to entity encounters, psychedelic visions can tell us something about who and what we are.

During everyday waking consciousness we typically perceive a stable world of objects. This feels like a completely passive process–just open your eyes and there’s the world, no effort required. In reality, the felt simplicity of this experience masks the highly complex processing going on in your brain that allows you to see. Your retina is not a clear window through which your soul looks out onto the world. Instead it is a fleshly surface similar to the rest of your body, a piece of meat that blocks the light from the outside world, rather than letting it in.

When we see, what actually happens is that the patterns of light are transformed into electrochemical signals that are sent down the optic nerve to the brain. The brain, sitting in your pitch-black skull, learns to actively build models of the objects out there in the world that these electrochemical clues relate to. Your normal perception can be understood as a controlled hallucination, kept in check by the data coming in through your senses and by your expectations of the world around you. However, this balance between the internal creativity of the models in your brain and the extent to which they are kept in check can be altered.

Psychedelics interact with brain cells to alter their activity in ways that disrupt the normal process of perception. At low doses of a psychedelic, the visual world becomes distorted in reliable ways. Perhaps you experience trails of light following your hands as you move them, or you perceive the walls to be breathing. This can be understood as the result of a temporary impairment in the Default Mode Network (DMN).

The DMN is a group of brain areas that builds up expectations about the world and uses them to keep our perception in check. When its ability to do this is reduced, our internal models of the visual world are free to make guesses about what we’re seeing. We may overestimate the distance of the wall, then correct for our error based on the sensory data. Given the newfound flexibility offered to the brain areas involved in this process, they may underestimate, then overestimate again. This process can continue on and on, resulting in an oscillating interpretation of the wall’s distance and the perception of the walls breathing. The same goes for the precise position of one’s hand in space over time.
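The breathing-walls idea can be caricatured numerically. This is purely an illustrative toy model, not something from the article: a perceptual estimate is nudged toward the true wall distance on each step, and the correction gain stands in for how tightly expectations constrain the guess. A modest gain settles smoothly; an exaggerated gain overshoots on every correction, so the estimate swings back and forth around the target.

```python
# Toy model (illustrative only): iterative correction of a perceptual
# estimate toward the true value, with a tunable correction gain.

def settle(true_value, guess, gain, steps=8):
    """Repeatedly correct `guess` toward `true_value`; return the trajectory."""
    history = [round(guess, 3)]
    for _ in range(steps):
        error = true_value - guess
        guess += gain * error          # one correction step
        history.append(round(guess, 3))
    return history

# Gain below 1: the estimate converges smoothly (ordinary perception).
print(settle(true_value=2.0, guess=1.0, gain=0.5))
# Gain above 1: every correction overshoots, so the estimate oscillates
# around the true distance -- the walls appear to "breathe".
print(settle(true_value=2.0, guess=1.0, gain=1.8))
```

The overshooting trajectory alternates above and below the true distance while slowly damping, which is roughly the perceptual signature the paragraph describes.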

We’ve all seen shapes in the clouds or perhaps faces in the bark of a tree. Here we are using ambiguous sensory input and are finding internal models in our brains that roughly match the pattern. When on a psychedelic this process goes into overdrive and can result in us mistaking a flower for a lizard or a bowl of spaghetti for a bowl of worms. Why do we tend to see natural shapes like animals and not artificial ones? Why lizards and worms and not buildings and airplanes?

Our visual system evolved to recognize the patterns of the natural world that are relevant to survival and so, when given the opportunity to play, our visual systems show us the natural forms we were built to detect. We have templates for snakes and spiders deeply programmed into us, for example, something that simply isn’t the case for modern dangerous objects like cars and guns.

Our hallucinations can be thought of as what our brains expect to see, something that may also account for why we perceive colors to be highly saturated in the psychedelic state, as we’re experiencing a purer version of the color’s template than we typically encounter in daily life. Understanding the basis of hallucination in our evolutionarily programmed expectation of the visual world also offers a way of understanding why eyes, serpents, and insects are so common in higher-dose psychedelic experiences, as these are highly biologically relevant patterns for us as a species.

The models in our brain that underlie perception are constructed by networks of brain cells. The vast computational power of such networks has resulted in their being imitated in modern artificial intelligence systems. Artificial neural networks such as Google’s Deep Dream can be trained to recognize different images, resulting in the construction of models in the networks in a way that approximates learning in the brain. When we dream or hallucinate, the contents of these models become active with no sensory input to keep them in check. The same can be done with artificial neural networks: when tasked with generating rather than recognizing images, they produce distinctly psychedelic visuals. This provides striking evidence that certain psychedelic visuals are generated by the models in your brain being pushed into “create” mode.
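The "create mode" trick can be shown with a minimal gradient-ascent sketch. Everything below is an assumption-laden toy of my own, vastly simpler than Deep Dream itself: a linear "detector" scores how well a tiny image matches a stored template, and instead of training the detector we run gradient ascent on the image, so a noise input is gradually pulled toward the pattern the model expects to see.

```python
import numpy as np

# Toy sketch (my own construction, not Google's code): a linear "detector"
# scores how strongly a tiny 4-pixel image matches a stored template.
# Running gradient ascent on the *input image*, rather than on the
# detector's weights, pulls a noise image toward the template -- the
# model's expectation gets rendered into the picture.

template = np.array([1.0, -1.0, 1.0, -1.0])     # the detector's learned "model"

def score(image):
    """Detector activation: how well the image matches the template."""
    return float(image @ template)

rng = np.random.default_rng(0)
image = rng.normal(size=4)                      # start from pure noise
for _ in range(100):
    grad = template                             # d(score)/d(image) for a linear detector
    image = image + 0.1 * grad                  # ascend the activation
    image = image / np.linalg.norm(image)       # keep overall "brightness" fixed

# The noise has been reshaped into (a scaled copy of) the template itself.
print(np.round(image, 2))
```

The same recipe, scaled up to a deep convolutional recognizer and a full-size photograph, is essentially what produces Deep Dream's hallucinatory imagery.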

At higher doses geometric patterns can be observed, especially when one closes one’s eyes, thereby excluding the sensory data that would otherwise keep one’s models in check. The patterns of the natural world are geometric in essence; all patterns are. Geometric hallucinations can reveal certain common patterns that are fundamental to our experience of the world. Psychedelics are not the only way to shift the visual brain into a mode where it displays their core patterns. When one “sees stars” after being hit on the head, the visuals being perceived are called phosphenes. Phosphenes can also be seen when pressure is placed on the eyeball. These geometric patterns are routinely experienced in the psychedelic state and have been observed in ancient cave art, leading to the suggestion that early cave art represents the earliest attempts of our species to carry back the experiences of the psychedelic state.

At high doses of classical psychedelics, people routinely experience ancient imagery exemplified in the art of cultures such as those of ancient Egypt and Mesoamerica. Mesoamerican cultures are known to have ingested psychedelic mushrooms, and it has been suggested that religion in ancient Egypt for a time revolved around the consumption of such mushrooms, based on the similarity of depictions of Egyptian crowns to different stages of this mushroom’s development. The psychoactive blue lotus is also believed to have been consumed ritually in ancient Egypt. From our contemporary perspective, the styles depicted in the artworks feel as if they originated in these cultures. In reality the imagery of the psychedelic experience may have come first and the art later.

Another experience that can be had at high doses, especially with DMT, is the experience of visiting another “dimension”. In such an experience the person typically feels as if they have left their body behind and have been transported to another world. Understanding that our perception of the world around us is not the passive experience of a true picture of the world but instead is a controlled hallucination generated by our minds allows us to explain such experiences. As in a dream, the brain is pushed into a creative mode where it is especially decoupled from the sensory data coming in from the world around us. Certain contents, such as snakes and eyes, can be readily explained by the idea that they reflect deeply programmed visual patterns that are relevant to survival.

Not all of the contents fit neatly into this interpretation however, with visions of technology being particularly difficult to account for. One speculative explanation for such visions is that they reflect geometric hallucinations varying in three dimensions of space and in time.

When having the experience of moving to another dimension it is common to experience passage down a tunnel. It has been suggested by “the Godfather of LSD”, Stan Grof, that the tunnel experience is a memory of the experience of birth, although the fact that those born by cesarean section can still experience tunnels seems to rule this explanation out. Neurobiologist Jack Cowan has argued that spontaneous activity moving across the visual cortex might translate into the experience of concentric rings, as a result of the way that the retina maps onto the surface of the cortex.

Dramatic psychedelic experiences can feel like they reflect actual experiences of pre-existing phenomena that exist outside oneself, rather than being generated from within. This shouldn’t be a surprise as our perception always presents us with the feeling of an objective reality outside of ourselves, even though it is a controlled hallucination created inside us.

The nature of these experiences has led some scientists and philosophers to suggest that the material world is secondary to another reality that we interact with in these states, this ultimate reality variously conceived as a spiritual realm, other dimensions, a baseline reality in which our current world is merely a simulation, or a universal consciousness that underpins reality itself. The challenge for these perspectives, beyond overturning the current prevailing paradigm, is to explain why low doses produce effects that are so readily explained by our understanding of how normal perception functions in the brain, and why, at high doses, these other worlds show patterns that are biologically relevant to us, such as human eyes and natural imagery.

Psychedelic experiences can allow us to delve into our deepest personal programming and beyond into our deepest evolutionary programming, bequeathed to us by countless ancestors. Understanding the origin of psychedelic visions as coming from within does not reduce their meaning. We get the opportunity to explore ancient patterns on which our sense of meaning itself is scaffolded, we may confront aspects of Jung’s collective unconscious or realize how our visions connect us in time to our unimaginably long past as an evolved creature. Understanding how such experiences are generated, while interesting, is not necessary to derive pleasure and benefit from them. The value really is in the experience itself, whatever is going on behind the scenes.

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

The cerebellum isn’t what we thought

by Diana Kwon | The Atlantic | 11 Oct 2020

Scientists long believed its function was simply to coordinate movements. Now they suspect it could do much more.

When a 22-year-old college student turned up at a hospital after falling on ice and hitting her head, doctors conducted a CT scan that revealed a surprise: a tumor in her cerebellum, the fist-size structure at the back of the brain. After surgeons successfully removed the mass, the woman started exhibiting strange behaviors. She was emotionally unexpressive and acted inappropriately—undressing in the hospital corridors, for example. She spoke in a fast, high-pitched, unintelligible voice and had trouble doing basic arithmetic, drawing, reading, and writing. Although she began to improve after a few weeks, two years passed before she could take a remedial course through a junior college—and for more than two decades, her decision-making remained impaired.

This unusual case, which was first reported in the 1990s, defied a notion that had persisted for centuries: that the cerebellum’s job is limited to coordinating movements.
For many neuroscientists, the structure took a back seat to the cerebral cortex, the thin layer of cells covering the creased, baseball-glove-shaped lump that most of us think of when we imagine the human brain. The cerebellum was considered so unimportant that many scientists would simply ignore it in neuroimaging studies—or, when they removed animals’ brains for many types of research, they would chop the structure off and throw it away. “That’s how the field has been for a very long time,” says Krystal Parker, a neuroscientist at the University of Iowa.

Things are slowly beginning to change, however, as evidence builds that the cerebellum makes important contributions to cognition, emotion, and social behavior. On top of that, studies suggest that the cerebellum may play a key role in autism, schizophrenia, and other brain disorders. Researchers are now probing the brains of both mice and people to understand how the cerebellum contributes to these conditions.

"Investigations of the cerebellum have exploded over the last few years," says Catherine Stoodley, a neuroscientist at American University and a coauthor of a 2019 paper in the Annual Review of Neuroscience on the cerebellum’s role in cognition. “It’s very exciting.”

At first glance, the cerebellum looks a bit like a wrinkly, overgrown walnut shell. A closer look reveals two hemispheres with surface creases that sink down into deep grooves and split off into a network of coral-like branches. Peering through a microscope reveals a uniform pattern of densely packed cells. The cerebellum makes up only about 10 percent of the human brain’s mass but contains more than half of its neurons. Stretched out, the cerebellum’s surface area would be nearly 80 percent that of the cerebral cortex.

The earliest experiments with the cerebellum—Latin for “little brain”—date back centuries. Those investigations weren’t pretty: Scientists simply lopped off the structure from live animals, then observed their behavior. For example, the 19th-century French physiologist Marie-Jean-Pierre Flourens conducted cerebellectomies on pigeons and reported that the animals started to teeter and totter as if intoxicated. These findings led him to propose that the structure was necessary for coordinating motion. Clinical observations of people with cerebellar injuries later confirmed this hypothesis, cementing the cerebellum’s reputation for nearly two centuries as a movement-coordination structure.

A small number of scientists started to challenge this description in the 1980s. Foremost among them was Henrietta Leiner, who had initially trained in mathematics, physics, and computer science but later took an interest in neuroanatomy. She became captivated by the cerebellum as she pondered the purpose of the thick tract of nerve fibers that connects it to the cerebral cortex.

Leiner also questioned why the cerebellum evolved to be so much larger in humans than in other animals. (According to one estimate, the human cerebellum is, on average, 2.8 times bigger than expected in primates our size.) Why would that be so, if all it did was coordinate movement? In 1986, Leiner—along with her husband, computer scientist Alan Leiner, and a neurologist named Robert Dow—proposed a radical hypothesis. The human cerebellum, they said, contributed to core thinking skills such as the ability to plan one’s actions.

Jeremy Schmahmann, then a neurology resident at Boston City Hospital, also developed a fascination for the cerebellum around that time. His interest stemmed from emerging evidence that another part of the brain once thought to be involved solely in motor control—the basal ganglia—also contributed to cognition. This led Schmahmann to wonder whether the same could be true of the cerebellum.

To address this question, Schmahmann set out on what he describes as an “archeological dig” through the stacks at Harvard’s Countway Library of Medicine. There, he discovered manuscripts dating to the 1800s documenting instances of cognitive, social, and emotional impairments in patients with cerebellar damage—and in rare cases where people were born without a cerebellum at all. “There was a little counterculture going back right to the beginning that was completely neglected,” says Schmahmann, now a neurologist at Massachusetts General Hospital and a coauthor of the recent review with Stoodley.

The historical reports persuaded Schmahmann to investigate further. In experiments with monkeys, he and his adviser, neuroanatomist Deepak Pandya, found evidence that the cerebellum receives input via the brainstem from parts of the cerebral cortex that, in the parallel areas of human brains, are involved in functions such as language, attention, and memory. “This flew in the face of accepted wisdom,” Schmahmann says. “We had some very strong opponents—but most, once the data became available, came around.”

Also around that time, another group, led by University of Pittsburgh neurobiologist Peter Strick, traced the connections going the other direction—from the cerebellum to the rest of the brain. This two-way communication bolstered the case that the cerebellum does much more than coordinate movements.

Subsequent clinical observations and neuroimaging studies have further strengthened the argument.

In the late 1990s, Schmahmann reported the first description of cerebellar cognitive affective syndrome after observing that people with cerebellar damage—due to degeneration or after tumor removal, strokes, and infection—exhibited a wide array of impairments in cognition and behavior. These included difficulties with abstract reasoning and planning, changes in personality—such as the flattened emotions and inappropriate behaviors he observed in the college student with the cerebellar tumor—and problems with speech. Some patients recovered after several months; in others, symptoms persisted for years. This condition, which was later dubbed “Schmahmann’s syndrome,” strengthened the evidence that the cerebellum was indeed involved in a variety of cognitive processes.

Rare cases of people born missing parts of their cerebellum have also hinted at broader functions. In addition to difficulty coordinating their movements, these individuals exhibit signs of Schmahmann’s syndrome, as well as autistic-like traits such as obsessive rituals and trouble understanding social cues.

In another influential study, Harvard neuroscientist Randy Buckner and his colleagues mapped communication between the cerebral cortex and the cerebellum in humans. By scanning the brains of healthy people using functional magnetic resonance imaging, the team revealed that activity in the majority of the cerebellum was in sync with activity in parts of the cerebral cortex responsible for cognitive functions—and not with cortical areas involved in movement. “That paper was incredible for showing that the majority of the cerebellum can actually be accounted for by non-motor functions,” says Ann Shinn, a psychiatrist at McLean Hospital in Massachusetts.

These studies and others are making it increasingly clear that the cerebellum has many roles. But a big question remains: What, exactly, is its overall function?

The highly organized, grid-like architecture of cells in the cerebellum has inspired some scientists to suggest that it carries out a single computation. Schmahmann has dubbed this hypothesis the “universal cerebellar transform.” Exactly which core computation could account for the cerebellum’s involvement in movement, cognition, and emotion remains an open question. But scientists have proposed a variety of possibilities, such as making and updating predictions or the precise timing of tasks.

Given the cerebellum’s myriad roles, some scientists suspect the structure may be involved in several brain-related disorders. The two conditions for which there is currently the most evidence are autism and schizophrenia.

Cerebellar abnormalities are some of the most common neuroanatomical differences seen in people with autism, and physicians have observed that injuries to the cerebellum at birth considerably increase the risk that a child will develop the condition. Recent studies also suggest that the cerebellum may have an outsize influence on development and that early irregularities in this structure may predispose people to conditions like autism.

Sam Wang, a neuroscientist at Princeton, and his team have shown that inactivating the cerebellum in mice during development using chemogenetics—a method for manipulating specific neural circuits using engineered molecules injected into the brain—leads to characteristics in the animals that mirror those seen in humans with autism. The mice lost the preference to spend time around another mouse instead of an inanimate object, and had difficulty adjusting to a new task. The same manipulation in adult mice had no such effects.

Other researchers have found that it may be possible to modify some of these traits by targeting the cerebellum. Stoodley and her colleagues have demonstrated that stimulating the cerebellum with chemogenetics can reverse social deficits in genetically engineered mice that show autism traits. Her lab is now assessing whether they can modify social learning in both autistic and neurotypical people by targeting the cerebellum with a technique called transcranial direct current stimulation, which uses electrodes placed on the head to modulate brain activity.

The idea that the cerebellum might be involved in schizophrenia has been around for decades, but until recently there was little experimental evidence in humans. In 2019, however, a group including Schmahmann reported that stimulating the cerebellum with a method called transcranial magnetic stimulation (TMS), which uses magnets to create electrical currents in the brain, could alleviate what are known as schizophrenia’s negative symptoms, which include anhedonia (the inability to feel pleasure) and a lack of motivation. If TMS therapy proves effective, it could fulfill a long-standing need. Antipsychotic medications can successfully reduce what are known as schizophrenia’s positive symptoms—in other words, additional behaviors not typically seen in healthy people—such as hallucinations and delusional thoughts. But effective therapies for the negative symptoms remain elusive.

“There’s a lot of things we need to work out before this would become a therapeutic,” says Roscoe Brady, a psychiatrist at Boston’s Beth Israel Deaconess Medical Center who was involved in that trial. That said, he adds, TMS is one of the most promising options he’s seen in the published research.

Brady and his colleagues are now carrying out a follow-up study with a larger group of people. They’re also tackling the question of how, exactly, cerebellar stimulation leads to improvement. At the University of Iowa, Parker and her colleagues are also testing whether cerebellar TMS can improve mood and cognition in people with conditions including schizophrenia, autism, bipolar disorder, depression, and Parkinson’s disease. "The abnormalities in working memory, attention, and planning are very similar in many of these conditions," Parker says. Ultimately, she hopes that teasing apart the cerebellar contribution to these conditions will lead to the development of new treatments.

Whether cerebellum-based therapies can help people with these wide-ranging conditions remains to be seen. What’s clear, however, is that the cerebellum can no longer be ignored—and that its connections throughout the brain and contributions to brain function may be much broader than scientists had initially imagined.

“What I’m hoping comes out of all of this is that people can’t get away with eliminating the cerebellum from the research that they’re doing,” Parker says. “It’s almost always doing something related to whatever people are studying.”

mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

To capture simultaneous and continuous measurements of the neuromodulators dopamine and serotonin, the study authors designed a microelectrode capable of taking 10 measurements per second.

Serotonin and dopamine linked to decision-making: Study

by Amanda Heidt | The Scientist | 16 Oct 2020

In a first-of-its-kind study, researchers monitored subsecond changes in levels of the neurotransmitters in the human brain, unlocking new insight into their function.

Long associated with reward and pleasure, dopamine and serotonin may also be involved in general cognition, shaping how people perceive the world and act on those perceptions, a new study finds.

For the first time, researchers have continuously and simultaneously monitored the two neuromodulators in the human brain. The results, published October 12 in Neuron, offer new opportunities to test hypotheses previously studied mostly in animal models.

“This study isn’t just measuring dopamine and serotonin; it’s building upon the deep foundation looking at neural mechanisms for perceptual decisions in animals and humans” and linking the findings of these studies together, Tim Hanks, a neuroscientist at the University of California, Davis, who was not involved in the study, tells The Scientist.

“There’s a growing recognition that [dopamine and serotonin] have more refined and nuanced roles than what may have once been believed, and this study really makes that case clear in human decision-making,” says Hanks.

Both neuromodulators have been heavily studied in animals, but animals require training to carry out decision-making tasks—training that often comes with a reward. As a result, it can be difficult to tease apart the decision-making from the reinforcement they receive in return. “Animals are a limited model of the rich thoughts and behaviors that we see in humans,” says Dan Bang, a neuroscientist at University College London and the lead author of the new study.

To study dopamine and serotonin signaling in humans, the team recruited five volunteers who were set to undergo brain surgery to treat either Parkinson’s or essential tremors and agreed to have their neurochemicals monitored during the procedure. Surgeons keep patients awake during the operation and use probes to measure brain activity for safety. The research team, led by Read Montague, a neuroscientist at Virginia Tech, was able to insert its own microelectrode into the caudate nucleus of four of the volunteers and the putamen of the fifth. Both structures are regions of the striatum and are involved in movement, learning, and reward.

As they underwent surgery, each participant completed a modified version of a common visual task called the random dot motion paradigm. In each round of the task, a person was shown a cloud of flickering dots moving across a screen. Some dots moved together in the same direction, while the rest moved randomly; the proportions undergoing each type of movement determined the difficulty of the task. In the standard test, the dots disappear, and the subject must indicate whether they had, on average, been moving toward the left or the right. In the amended protocol, participants were instead shown a random angle after the dots had disappeared and had to decide whether the dots had been moving to the left or right of that angle.

"This is definitely putting the importance of dopamine and serotonin into a new light." - Ken Kishida

In this way, the scientists were able to vary the difficulty and uncertainty of a person’s perception by changing both the number of dots moving in synchrony and how close their path of motion came to the randomly selected reference angle. After making their choice, participants rated how sure they were of their decision.
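The logic of the amended task can be sketched in a few lines. All names and numbers below are my own assumptions for illustration, not details from the paper: a fraction of the dots (the coherence) share one direction while the rest move randomly, and the simulated judgment is whether the average motion fell to the left or right of the reference angle.

```python
import math
import random

# Hypothetical sketch of the amended random-dot-motion task: a fraction
# `coherence` of dots move at `true_angle`, the rest move in random
# directions, and the "observer" judges whether the mean motion lies to
# the left or the right of a reference angle shown afterwards.

def trial(true_angle, reference, coherence, n_dots=500, seed=42):
    rng = random.Random(seed)
    xs = ys = 0.0
    for _ in range(n_dots):
        angle = true_angle if rng.random() < coherence else rng.uniform(0.0, 2.0 * math.pi)
        xs += math.cos(angle)
        ys += math.sin(angle)
    estimate = math.atan2(ys, xs)  # perceived mean direction of the cloud
    # Signed angular difference between estimate and reference; the
    # "left"/"right" sign convention here is arbitrary.
    diff = math.atan2(math.sin(estimate - reference), math.cos(estimate - reference))
    return "left" if diff > 0 else "right"

# An easy trial: high coherence, reference far from the true direction.
print(trial(true_angle=0.0, reference=-0.5, coherence=0.8))  # → left
```

Lowering `coherence` or moving `reference` closer to `true_angle` makes the resultant motion vector noisier relative to the margin being judged, which is exactly the two-knob difficulty manipulation the study describes.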

A microelectrode continuously measured both dopamine and serotonin levels in the caudate nucleus or putamen, taking 10 measurements each second. Scientists have never before been able to monitor these neurotransmitters at such biologically relevant speeds in humans. Less-invasive methods such as PET scanning or fMRI typically take only one measurement per minute.

The probe used in the study is made of carbon fiber and uses low voltages to detect dopamine and serotonin activity in real time.

Within the caudate nucleus, serotonin levels were linked to uncertainty around perceptions in three of the four participants. When the task was more difficult and the outcome more uncertain, as estimated by the task variables and the participants’ self-reported uncertainty about their decisions, serotonin levels spiked shortly after the dots appeared on the screen. When the task was easier, serotonin dropped. In some previous human and animal studies, dopamine has had the opposite relationship with serotonin, and therefore with uncertainty, but in the new study, variations in the caudate nucleus’s dopamine levels did not track consistently with perceptual uncertainty.

In the putamen, however, the team did find strong evidence in support of opposing roles for dopamine and serotonin in relation to action, as evidenced by the time it took participants to make their choice about the direction of the dots. Both an increase in dopamine and a corresponding decrease in serotonin were associated with the subject’s choice to act, and both the change in neuromodulator levels and the decision itself happened more quickly when the task was easier and less uncertain.

"Taken together, these findings suggest that beyond their role as reward chemicals, dopamine and serotonin may contribute to cognition more generally, linking how we perceive the world and how we then go on to make decisions,” says Ken Kishida, a neuroscientist at the Wake Forest School of Medicine and a coauthor on the study. “This is definitely putting the importance of dopamine and serotonin into a new light.”

Even though this is a new finding in humans, it dovetails with what some researchers have begun to find in animals, says Armin Lak, a neuroscientist at the University of Oxford who was not involved in the study. In his own work, Lak has found links between dopamine and perception in rodents. “It’s really nice, for those of us working in neuroscience, to see this spectrum of studies all the way from animals to human volunteers.”

The biggest limitation of the new study, Lak adds, is the small sample size. Some of the team’s results, such as their data on the putamen, stem from only a single person. Kishida also points out that while dopamine levels varied more between people than did serotonin levels, that may be because some of the patients had Parkinson’s, a disease caused by dysregulated dopamine signaling.

Moving forward, the team plans to refine its microelectrode to recognize additional neurochemicals, such as norepinephrine. Having shown that the responses of neuromodulators can differ by brain region, they would also like to expand to include the cortex, amygdala, and hippocampus.

Better understanding of how dopamine and serotonin interact and their roles in different parts of the brain will also have important implications for treating neuropsychiatric disorders such as Parkinson’s and depression, says Hanks. Many treatments target these two modulators, but they do so across the entire brain and over longer time scales, so more knowledge could lead to more targeted and effective therapies.

“Because these neuromodulators have complex roles that depend on brain region, some will see it as a challenge, because it means that we can’t just use a medication that’s affecting [the brain] diffusely,” Hanks tells The Scientist. “But at the same time, I would argue that this represents a tremendous opportunity to make [therapies] even more effective.”


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Scientists unlock the neurological code of dissociation

by Sarah Ratliff | LUCID News | 15 Oct 2020

A team of bioengineers from Stanford University has recreated neural activity correlated with dissociative states, such as those induced by ketamine, in the brains of mice.

Scientists are unlocking the neurological code of dissociation, a form of altered consciousness that is frequently associated with psychedelic drug use. The findings, published in Nature, point toward a possible future when psychedelic states can be created through technology rather than substances.

Using data obtained from sophisticated brain imaging technology as their guide, a team of bioengineers from Stanford University was able to manipulate the activity of neurons in the brains of mice, getting them to fire in synchronized rhythms that recreate patterns correlated with dissociative states. Even more significantly, the researchers were able to recreate the same rhythmic patterns in the brain of one human test subject, who suffered from a form of epilepsy that causes dissociative episodes.

Initially, some of the mice used in the experiment were fed doses of ketamine, an anesthetic with psychedelic qualities that will cause dissociation if taken in large enough quantities. During follow-up monitoring, the researchers discovered rhythmic and coordinated firing of neurons in the retrosplenial cortex, an area of a mouse’s brain that acts as an interface for a range of cognitive functions.

“It was like pointing a telescope at a new part of the sky, and something really unexpected jumped out at us,” Dr. Karl Deisseroth, a Stanford neuroscientist who participated in the project, told NPR.

Excited by the implications of this discovery, the researchers turned to a cutting-edge technology known as optogenetics, which relies on finely-tuned, precisely-aimed light beams to provoke neural responses in individual cells. Applying optogenetic techniques, they were able to replicate the neural patterns of dissociation in the brains of mice that had not been given ketamine.

Neural activity in the brain of the human subject was monitored through electrodes that had been implanted by doctors to aid in the treatment of their epilepsy. When the patient reported dissociative symptoms, the scientists detected rhythmic oscillations in an area of the brain known as the posteromedial cortex (PMC), which is connected to self-awareness and self-reflection and is structurally analogous to the retrosplenial cortex in the brains of mice.

Once again, the scientists were able to replicate these patterns of activity artificially, this time using high-frequency electrical signals. During this procedure, the patient reported symptoms of dissociation identical to those produced as a side effect of the epilepsy, showing that the link between rhythmic brain patterns and dissociation was more than coincidental.

Exploring the therapeutic potential of dissociation

Rhythmic oscillations in the brain are associated with integrated consciousness, learning, and memory. They strengthen neural connections and induce more vibrant functioning at the cellular level.

Conversely, scattered or chaotic firing of neurons is a sign of dysfunction. This type of activity is associated with debilitating neurological conditions like Parkinson’s disease, schizophrenia, and epilepsy.

When someone experiences dissociation, their conscious awareness seems disconnected from their mind, body, and the surrounding environment. Dissociation represents a profound dislocation of consciousness, somewhat akin to an out-of-body experience.

But as these latest experimental findings make clear, dissociation is not synonymous with neural chaos, unlike the epilepsy that sometimes precipitates it. It is instead an alternative form of consciousness that can emerge under certain unusual circumstances, sparked by a diverse range of potential causal factors.

If experienced frequently and spontaneously, dissociation can be a sign of mental illness. But when carefully controlled, the induction of dissociative states can have real therapeutic value.

Psychedelic-assisted (ketamine) therapy, which leverages the drug’s capacity to cause dissociation, has proven especially useful for the treatment of depression. The changes in consciousness caused by dissociative episodes appear to relieve the symptoms of depression within a few hours, producing strong antidepressant effects that may last for a week or more.

A 2019 study published in the journal Science found that the therapeutic consumption of ketamine can rapidly improve the functioning of mood-related brain circuitry. The drug helps regenerate broken or frayed connections between individual neurons within these circuits, by initiating the creation of new synapses (connectors) to replace those that have been lost. Synaptic destruction is a known side effect of stress, and exposure to chronic, long-term stress is believed to play a vital causative role in the onset and continuation of depression.

Notably, when doses of ketamine are too low to cause dissociation, they don’t appear to offer the same benefits.

“There seems to be this link between dissociation and the anti-depressive effect of ketamine,” explains Dr. Ken Solt, an anesthesiologist from Harvard Medical School who helped summarize the results of the Stanford study in an accompanying article.

Ketamine use can facilitate the mind-altering rhythms that correlate with dissociation. But theoretically, any process that can spark the firing of neurons in a controlled manner, in sequence and in predictable patterns, could produce these same rhythms. Presumably, dissociative states could therefore be created on demand, eliminating the need for any chemical supplement.

The promise of Psychedelic-Assisted Therapy—and its alternative

Psychedelic-assisted therapy relies on the mind-altering effects of ketamine to cause positive changes in neural functioning. But now that scientists have discovered a way to recreate that drug’s distinctive neural signature of dissociation, it may be only a matter of time before simulated forms of psychedelic-assisted therapy will be developed, possibly using optogenetic techniques like those adopted in the Stanford study.

As of now, ketamine is used primarily to treat depression. But preliminary research into the drug’s effect on PTSD and bipolar disorder has yielded positive outcomes for these conditions as well.

Psychedelic-assisted therapy may ultimately prove beneficial for people suffering from a broad variety of mental health conditions. If technology-based methods for creating dissociative states can duplicate those results, they could serve as an attractive alternative for professionals who aren’t comfortable prescribing psychedelics and for patients who aren’t comfortable taking them.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

Ketamine-induced brain rhythm causes dissociation

by Lily Aleksandrova, MSc, PhD | Psychedelic Science Review | 21 Oct 2020

While ketamine produces a whole symphony of effects in the brain, surprisingly, a localized neural oscillation in a lesser-known area of the brain is enough to cause feelings of dissociation.

In the last decade, ketamine has received considerable research interest as a novel, rapid-acting antidepressant. Classically used as an anesthetic, it induces feelings of dissociation or disconnection between the mind, body, and outside world. This side effect is reported even at low sub-anesthetic doses. The neural mechanisms of how ketamine causes “out-of-body” experiences have remained elusive.

While monitoring the activity of cells throughout the brains of ketamine-treated mice, Vesuna et al. from Stanford University recently stumbled upon a specific neuronal rhythm deep in the cortex, which may hold the answer. The discovery, reported in the journal Nature, identified a new brain-activity marker for dissociation in mice and, in a rare opportunity for scientists, also in a human. Their findings, summarized below, shed more light on the neural mechanisms underlying the mysterious phenomenon of dissociation.

Unique brain wave in mice linked to dissociation following ketamine

According to Vesuna et al., when administered to mice, the dissociative drugs ketamine and phencyclidine (PCP) caused a specific population of brain cells, found in layer 5 of the retrosplenial cortex, to fire at a rate of 1-3 times each second (Figure 1). The unique rhythm, which began within 2 minutes of drug injection and lasted 45 minutes, coincided with the mice experiencing a “dissociative-like” state.

Figure 1: Ketamine-induced brain rhythm causes dissociation in mice and a human subject.

The animal dissociative-like behavior was characterized by a disconnect between the perception of incoming aversive sensations, which remained intact, and more complex emotional responses to the threat, which were blunted. Namely, mice still reflexively withdrew their paw from a heat source but failed to lick the paw to cool it off as they normally would (they’re registering the sensation but don’t care about it as much).

Importantly, the ketamine-induced firing pattern caused this cortical region, which normally communicates with the rest of the brain, to become disconnected. Since the retrosplenial cortex plays a role in cognition, navigation, and episodic memories, this could explain why such functions may go “offline” during dissociation.

Next, Vesuna et al. used optogenetics, a cutting-edge technique that shines light onto a brain area of interest to control its activity. Importantly, artificially producing this brain rhythm in drug-naïve mice caused them to act as if they were under the influence of ketamine (Figure 1). Going one step further, the authors identified a key protein found in retrosplenial neurons, a “pacemaker” ion channel called HCN1, which plays a crucial role in setting this unique rhythm. Specifically, in mice lacking the HCN1 channel, ketamine failed to induce the key oscillation and elicit a dissociation-like state.

Role of deep cortical rhythm in dissociation confirmed

Brain activity in humans can often be measured with electroencephalography (EEG) using sensors placed on the scalp. However, this brain rhythm was located deep in the cortex, requiring more invasive techniques for investigation. Luckily, Vesuna et al. gained access to a unique human volunteer. The patient, who had a form of epilepsy that caused dissociation, had electrodes implanted in the brain (for diagnostic and treatment purposes), allowing for several exciting observations.

First, self-reported pre-seizure dissociation correlated with a similar rhythm localized to the corresponding cortical region in the human brain. Below are excerpts from the patient report describing their pre-seizure dissociation experience.

“I was listening to two parts of my brain speak to each other in a way that a third part of my brain, which I considered to be me, was able to listen.”

“What would it feel like if someone else were to come into your head?… What I considered me shrank to this other part of me where the other parts of my brain that were talking, I stopped considering them me.”

“…where in this 3D space am I?…I took a blanket…threw it over my body, just to see, because I knew that when I don’t feel it, I don’t consider it me and immediately my legs were no longer a part of me.”

Further, the electrical stimulation of this brain area caused immediate feelings of dissociation in the patient (Figure 1). Even though it represents a single clinical observation, this experiment replicated what was observed in the mouse brain.

Putting these findings into context: Brain waves and the retrosplenial cortex

Brain waves refer to rhythmic, synchronized patterns of neural activity. In humans, each of the dominant brain waves (from delta to gamma, 0.5-42 Hz) is thought to correspond to a specific brain state (e.g., deep sleep, wakefulness, deep thought). Such oscillations allow different regions to communicate effectively with each other, similar to tuning a radio to a frequency to boost the signal and reduce the noise.
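The band-by-band “tuning” idea can be made concrete with a toy signal analysis. This is a hypothetical sketch, not part of the study: it synthesizes a signal dominated by a 2 Hz (delta-range) rhythm, then measures power per conventional frequency band with a Fourier transform (band edges follow the rough delta-to-gamma ranges mentioned above).

```python
import numpy as np

# Toy illustration of frequency bands: build a 10-second signal dominated by a
# 2 Hz rhythm plus noise, then sum spectral power within each named band.
fs = 250                       # sampling rate in Hz (invented for this sketch)
t = np.arange(0, 10, 1 / fs)   # 10 seconds of samples
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(t.size)

freqs = np.fft.rfftfreq(t.size, 1 / fs)        # frequency of each FFT bin
power = np.abs(np.fft.rfft(signal)) ** 2       # power spectrum

# Approximate conventional band edges, in Hz
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 42)}
band_power = {name: power[(freqs >= lo) & (freqs < hi)].sum()
              for name, (lo, hi) in bands.items()}

dominant = max(band_power, key=band_power.get)
print(dominant)  # -> delta
```

Because the synthetic rhythm sits at 2 Hz, nearly all the power lands in the delta band; real recordings mix many bands at once, which is what makes a single localized oscillation, like the one Vesuna et al. describe, stand out.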

Ketamine has been shown to cause global changes in cortical brain waves. In contrast, the dissociation oscillation identified by Vesuna et al. is localized to a very small cell population. In simple terms, while ketamine produces a whole symphony of effects in the brain, surprisingly, a single note played by a lesser-known instrument in the orchestra is enough to cause feelings of dissociation.

The retrosplenial cortex has been suggested to play a role in mediating the interaction between perception and memory, as well as in translating between self-centered and world-centered spatial information. While the exact functions of this brain region are still not well understood, it now appears crucial for keeping us tethered to reality.

The clinical significance of this new study

Recent research has primarily focused on identifying the key brain mechanisms responsible for ketamine’s therapeutic actions. Understanding how this drug causes its dissociative effects may facilitate the development of a new generation of safer, more selective antidepressants. However, dissociation may not just be a side effect but an integral component of ketamine’s therapeutic action.

Mammalian brains can temporarily decouple the mind and body, which may be an evolutionarily adaptive mechanism (e.g., during trauma). Not only does ketamine hijack this mechanism, but the level of dissociation reported also predicts a more robust and sustained antidepressant response. It is not hard to imagine that being temporarily forced to dissociate from rigid, negative, and maladaptive beliefs about oneself and the world would be beneficial for patients suffering from depression.

Whether the neural mechanisms identified by Vesuna et al. apply to the mind-altering/out-of-body effects of classical psychedelics (which act through a different brain receptor than ketamine) remains an open question.


mr peabody

Moderator: Music Discussion, PM
Staff member
Aug 31, 2016
Frostbite Falls, MN

What is the Default Mode Network?

by Sabrina Eisenberg, MS | Psychedelic Science Review | 18 Nov 2020

A default level of brain activity sheds light on the source of consciousness and mechanisms of ego death.

What happens when a person lies down and draws their attention away from the outside world? One might think of this as a ‘resting state’ in which neural activity decreases. Researchers proposed an alternative after noticing consistent, task-independent activity during this ‘resting state’ that was absent during goal-directed behavior. Raichle and colleagues referred to these areas as part of the Default Mode Network (DMN), reflecting the presence of a baseline, or ‘default’, level of neural activity. The following article will discuss the DMN, its relationship to psychedelics, and its role in current and future research.

Oxygen Extraction Fraction and the DMN

Researchers used the oxygen extraction fraction (OEF) to establish the DMN’s existence in the brain. The OEF is the ratio of oxygen consumed from the blood to local oxygen availability (by means of blood flow). Whereas blood flow changes significantly depending on activity, oxygen consumption remains at a nearly constant level. A decrease in OEF thus represents an increase in blood flow, and vice versa. Rather than relying solely on blood flow levels to determine neural activity, the OEF allows for a more comprehensive view of the brain’s energy dynamics.

The OEF in the default mode, or the mean OEF, is typically uniform and constant across brain areas regardless of spatial variances. A decrease in OEF from the mean represents an activation relative to the baseline of neuronal activity, whereas an increase represents a deactivation.
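The OEF logic above can be sketched numerically. This is a hypothetical illustration, not from the source articles: the function names and all numbers are invented, and real OEF values come from PET imaging, not a two-line ratio.

```python
# Sketch of the OEF reasoning: OEF = oxygen consumed / oxygen available.
# With consumption roughly constant, a rise in blood flow lowers OEF
# (activation relative to baseline) and a fall in flow raises it
# (deactivation). All values below are invented for illustration.

def oef(oxygen_consumed: float, oxygen_delivered: float) -> float:
    """Ratio of oxygen consumed from the blood to local oxygen availability."""
    return oxygen_consumed / oxygen_delivered

def classify(region_oef: float, mean_oef: float) -> str:
    """Below the mean OEF counts as activation; above it, deactivation."""
    if region_oef < mean_oef:
        return "activation"
    if region_oef > mean_oef:
        return "deactivation"
    return "baseline"

mean = oef(oxygen_consumed=1.0, oxygen_delivered=2.5)    # resting baseline
active = oef(oxygen_consumed=1.0, oxygen_delivered=4.0)  # flow up, OEF down
print(classify(active, mean))  # -> activation
```

The same comparison run with blood flow below baseline (e.g., `oxygen_delivered=1.5`) yields a deactivation, which is the pattern Carhart-Harris and colleagues observed in the DMN under psilocybin.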

Location and purpose of the DMN

Key areas associated with the DMN are the medial prefrontal cortex (mPFC), posterior cingulate cortex (PCC) and precuneus, inferior parietal lobule, lateral temporal cortex (LTC), and hippocampal formation. These areas confirm the DMN’s association with emotion and memory centers, rather than the sensorimotor cortex. These associations are unsurprising, given the correlation of the DMN with introspection, autobiographical memory, daydreaming, and envisioning the future. The high levels of neural connectivity in these regions signify their importance as hubs of information transfer.

Reframing the initial distinction between the DMN and other areas as intrinsic versus evoked activity, rather than rest versus task, distinguishes the DMN as a communication hub. The large proportion of the brain’s energy budget devoted to functional activity, nearing 90%, underscores the importance of intrinsic activity at baseline. Theories propounded to explain the afforded budget include the DMN as an adaptation serving to gather pertinent external information, a necessary tool for future planning, and a foundation for our sense of self.

Understanding how the DMN responds during self-referential thought, rumination, and awareness makes it an essential tool for probing the biological mechanisms behind the subjective experience of various psychological states and mental illnesses. DMN activity is increased in patients with schizophrenia, depression, and social phobia, but reduced in autism and Alzheimer’s disease, as well as during hypnosis and meditative states.

DMN and Psychedelics
“It appears that when activity in the default mode network falls off precipitously, the ego temporarily vanishes, and the usual boundaries we experience between self and world, subject and object, all melt away.” – Michael Pollan

The DMN cannot be discussed without considering its influence in the field of psychedelic research. When Carhart-Harris and colleagues originally studied the brain on psilocybin, they expected a flourish of activity where they instead found a significant drop-off, represented by decreased blood flow. Recalling from the earlier discussion of OEF, decreased blood flow indicates an increase in OEF, and, consequently, a deactivation from baseline. Concurrently, a decrease in positive coupling between the PCC and mPFC demonstrated possible evidence for a restructuring of the standard hierarchical model of neuronal activity.

Carhart-Harris returned to the idea of a restructured neural model in his theory of entropy and the return to the “primary state.” He theorized that psychedelics instigate highly disordered states, or states of high-entropy, by facilitating the collapse of the normally organized DMN and a decoupling between the DMN and medial temporal lobe (Figure 1). This returns the brain to a regressive, unconstrained state of cognition, or “primary state,” allowing exploration into latent thought and an insight into the unconscious mind.

Unconstrained thought can result in unexpected connections between brain networks. Considering the DMN’s association with metacognition and self-relevant thought, it is not difficult to see the link between a drop-off in the DMN’s activity and a phenomenon such as ego death, which is frequently experienced under the influence of psilocybin.

Figure 1: Comparison of structural connectivity between neural nodes in the normal state (a) and the psilocybin state (b). This is an example of psilocybin use resulting in high entropy (disorder) and reorganization of the brain as it forms new connections.

Not only psilocybin but also LSD and ayahuasca decrease the integrity of, and activity in, areas of the DMN, correlating with ego dissolution and altered consciousness. Decreased oscillatory power and desynchronization in DMN areas, and a more recently proposed association between the DMN and the claustrum, represent additional mechanisms linking psychedelics and ego dissolution.

Aside from consciousness, the DMN offers a basis to postulate the biological mechanism for psilocybin’s antidepressant effects. Two possible explanations are the serotonin 5-HT2A receptor and resting-state functional connectivity (RSFC). Following suit from the theme of positive effects of disintegration, decreased activity in the mPFC via 5-HT2A receptor stimulation may counteract the brooding, pessimism, and depression associated with an overactive mPFC.

Counter to this logic, increased DMN RSFC one day post-psilocybin treatment was predictive of later treatment response. Carhart-Harris likened this response to a ‘reset’ or normalization of acutely lowered RSFC, similar to that seen after treatment with electroconvulsive therapy (ECT). This theory, and the research on which it rests, is not entirely consistent within the literature and would benefit from further research.


The DMN’s suggested role in self-referential thought can further the study of psychedelics, the understanding of mental illness and psychological states, and the interplay between the two. Examples of this interaction include studies of the DMN as an explanation for psychedelics’ effect on exercise performance, and of psilocybin-assisted mindfulness training modulating self-consciousness and DMN connectivity with lasting effects.

This serendipitous discovery led to, and continues to shed light on, insights about consciousness, the mechanisms through which psychedelics influence brain chemistry, and the conceptualization of the ego. Whether explaining psychedelics as a therapeutic agent or garnering basic knowledge about psychological states and conditions, the DMN has proven useful to researchers across many domains.
