
Calcium channel blockers for reduction of tolerance to Amphetamines & addiction

Lightning-Nl (Bluelighter)
Due to the synergistic effects that I have experienced by taking Lamotrigine and Vyvanse at the same time, I came up with a fact-based explanation as to why Lamotrigine would synergise with Vyvanse. I am asking for input on whether or not you think I'm right, what I missed if anything, etc. This is all just harmless speculation, but interesting nonetheless.

This is about using Calcium channel blockers as a means of blocking NMDA-driven homeostasis from being achieved, thus slowing or reducing tolerance to Amphetamine and opiates. I am going to use Lamotrigine as an example because Lamotrigine is what I experienced this with.

Lamotrigine is an antiepileptic drug (used to stop seizures). It does this by blocking voltage-gated sodium channels and, to a lesser extent, calcium channels. This inhibits neurons in a literal sense. A lot of drugs inhibit excitation by antagonizing a receptor for a neurotransmitter that would cause excitation. Some drugs enhance GABAergic activity, which causes inhibition all over the body. But Lamotrigine works by literally blocking the electrical signal to the next neuron.

When a neurotransmitter binds to a neuron, it sends the message to either speed up neuronal firing or slow it down. To give an example, when Glutamate binds to a neuron (I'm using Glutamate as an example because its general purpose is to cause excitation), it causes depolarization of the neuron: a change in the cell's membrane potential that makes it more positive. That depolarization opens voltage-gated channels, allowing positively charged ions to move into the cell. As more positive ions flow in, the inside of the cell becomes positively charged relative to the outside. Once the inside is positive, the electrical force that was pulling positive ions in is gone, and they no longer "want" to keep flowing in. The channel doesn't just stay open, though: its gating is voltage-sensitive, and shortly after opening, a second (inactivation) gate blocks the channel.

This is what causes the neuron to fire: a positive charge inside the cell. The GABA system works in the opposite direction, by moving negatively charged Chloride ions into the cell. GABA opens chloride channels (these are ligand-gated rather than voltage-gated), and whenever the inside of the cell becomes more positive than usual, Cl- "wants" to move in. That influx drags the charge inside the cell back down toward its resting level, making the cell much less excitable.
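To put rough numbers on which way each ion "wants" to move, here's a small Python sketch (my own illustration, not taken from any paper in this thread) using the standard Nernst equation and typical textbook ion concentrations, which are assumptions. It just shows why opening Na+ or Ca2+ channels drives the inside of the cell positive, toward firing, while opening Cl- channels holds it down near rest:

import math

R, T, F = 8.314, 310.0, 96485.0  # gas constant, body temperature in K, Faraday constant

def nernst_mv(z, conc_out, conc_in):
    # Nernst equilibrium (reversal) potential in millivolts for an ion of valence z
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

# rough textbook values, (valence, [outside] mM, [inside] mM) - these numbers are assumptions
ions = {"Na+": (1, 145, 12), "K+": (1, 4, 155), "Ca2+": (2, 1.5, 0.0001), "Cl-": (-1, 120, 4)}

for name, (z, out, inside) in ions.items():
    print(f"{name:4s} reversal potential ~ {nernst_mv(z, out, inside):+6.1f} mV")

# prints roughly: Na+ ~ +67 mV, K+ ~ -98 mV, Ca2+ ~ +128 mV, Cl- ~ -91 mV.
# With the membrane resting near -70 mV, opening Na+/Ca2+ channels pulls the cell
# up toward firing threshold, while opening Cl- channels clamps it near rest.

None of this changes the argument above; it's just the arithmetic behind "positive ions rushing in make the inside positive."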

Gabapentin and Pregabalin are believed to work along related lines. In a way unrelated to the mechanism of action of Benzodiazepines, Gabapentin/Pregabalin bind the α2δ subunit of voltage-gated calcium channels, which reduces calcium influx into nerve terminals and dampens the release of excitatory neurotransmitters.

Lamotrigine works by "mimicking" the activation gate, because Lamotrigine itself is voltage-gated. Lamotrigine sits on the outside of the cell membrane, but it doesn't block the channel when there's no charge - when there's no charge, it freely floats around. When one side of the cell is negatively charged, the ions want to move into or out of the cell. Lamotrigine does too, but it's "too big" to fit inside the cell, so when it tries to get in, it gets stuck, thus blocking the channel.

That means that if the cell is negatively charged on the inside, Lamotrigine will try to move into the cell, but it can't. So it blocks the channel, thus not allowing the cell to become positively charged.

Lamotrigine does this at the sodium channels and, to a lesser extent, the calcium channels (this is important).


Lamotrigine and Glutamate

So what does this have to do with tolerance reduction? Well, Lamotrigine blocks excitation of a cell whether or not a neurotransmitter has "told" the cell to depolarize. That means the cell can't generate an action potential, so it doesn't fire. Normally, the firing of that cell would release an excitatory neurotransmitter (usually Glutamate) onto the next neuron to tell it that it, too, should fire. Because Lamotrigine doesn't allow the neuron to fire, that chain is broken.

Because Glutamate is the most abundant excitatory neurotransmitter in the body, action potentials are most often triggered by Glutamate binding. Because Lamotrigine doesn't allow the action potential to develop, even when there is plenty of Glutamate around, the ions can't change the charge of the cell. Therefore, the message is not sent down the axon and Glutamate transmission is disrupted. This leads to a widespread reduction in Glutamate release.


Tolerance reduction

It's been reported over the last decade that tolerance to Amphetamines (and, by extension, other stimulants) and tolerance to opioid drugs are both tied to increased Glutamatergic activity - specifically, increased sensitivity at the N-Methyl-D-Aspartate (NMDA) receptor. This increased sensitivity lets much larger amounts of Calcium ions get into the cell. That makes Glutamate more likely to cause an action potential, and therefore neuron firing will occur much more often, for much longer.

Lamotrigine, or any calcium channel blocker, therefore indirectly modifies this channel. Not only does it block Calcium ions from charging the cell, but this blocking of calcium causes decreased downstream release of Glutamate, since the signal doesn't need to be sent. This means less Glutamate will be modulating NMDA. This should keep the body from being able to reach homeostasis as it tries to adjust to amphetamine's presence.


Role in addiction

According to this study, NMDA modulation in dopaminergic neurons is essential to dopamine's role in habit formation. Knock-out mice were impaired at forming habits when NMDA receptors did not exist on dopaminergic neurons. This leads to the assumption that Glutamatergic activation is essential to habit-forming behaviors. So, when dopamine neuronal firing is increased, the NMDA receptor starts firing faster and more often in order to desensitize these dopaminergic neurons.

Not only does this downregulate dopamine, but when Amphetamine is then not present, the increased levels of dopamine that it would have produced aren't there - yet NMDA is still firing faster and stronger than it should. This leads to the need to execute that habit in order to reach homeostasis. By blocking this receptor from activating, the dopamine neurons will upregulate again and eventually, NMDA won't be hyperactive.

I know this is more specific to NMDA antagonists, but since NMDA receptor firing comes down to Calcium ions moving into the cell - if calcium can't get inside the cell to allow NMDA to fire, wouldn't that have the same effect as an antagonist?

http://www.ncbi.nlm.nih.gov/pubmed/22196339

What do you think?

Now I'm curious as to what everyone else thinks about this. This makes a lot of sense to me; however, the internet loves to point out flaws in your thinking. So is there anything I missed that wouldn't allow this to work? What are your opinions, guys?

Also, do you think Calcium channel blockers can be a means of treating addiction in the future?
 
Verapamil is a calcium channel blocker right? I was scripted that for 3 years for high blood pressure and I had a major freebase (I'd make it myself from "fishscale") thing going on, but it never was a problem to just outright stop and do something else with my time when I got mad at all the money I put on it. It also made it so I never ever got chest pain or got scared that my heart was gonna explode like so many other people have claimed. I was also shooting up Dilaudid one (intense) weekend a month for a whole year during the time I was using verapamil, and I really didn't slip or fall into reckless addiction. In fact, a lot of things started to suck for me after I stopped using verapamil. You might be on to something, sir (if verapamil is anything like the drugs you described).
 
I was also shooting up Dilaudid one (intense) weekend a month for a whole year during the time I was using verapamil, and I really didn't slip or fall into reckless addiction. In fact, a lot of things started to suck for me after I stopped using verapamil. You might be on to something, sir (if verapamil is anything like the drugs you described).

That's very interesting indeed.

You can't base a scientific conclusion off of one user report; however, I've read similar things from other users of opiates here on Bluelight and on Drug Forum. My post was meant purely to reflect the possibility of using calcium channel blockers as a means of lowering tolerance and treating an already established addiction. But now that you say that, it makes me extremely curious...

If NMDA can't modulate itself in the first place, would an addiction even occur? More than likely, physical addiction would still be a problem. However, without Glutamate influencing Dopamine-reinforced behavior, would you even become psychologically addicted in the first place?

Although, using a calcium channel blocker is kind of a really indirect way of modulating NMDA. I wonder if co-administration of an NMDA antagonist and a calcium channel blocker would produce even more powerful inhibiting effects on the formation of habits. This also makes me wonder if this could be used as a means of treating Obsessions and Compulsions.....
 
Lamotrigine, or any calcium channel blocker, therefore indirectly modifies this channel. Not only does it block Calcium ions from charging the cell, but this blocking of calcium causes decreased downstream release of Glutamate, since the signal doesn't need to be sent. This means less Glutamate will be modulating NMDA. This should keep the body from being able to reach homeostasis as it tries to adjust to amphetamine's presence.

The Wiki article suggests lamotrigine has no, or minimal, effects on NMDA. More importantly, a generalised decrease in Glu release (especially an indirect one) does not 'single out' the NMDA receptor in the same way that NMDA antagonists do, c.f. why nobody uses sodium channel blockers like lamotrigine, phenytoin, etc. for treatment-resistant depression. Remember, there's a lot more to glutamate than NMDA... also the mGluR's and such. You must remember, the brain is very intricately balanced. There's much more to keep track of than a fistful of chemicals, and it's not always easy to go from a paper argument to the reality of the (grey) matter.

Lamotrigine works by "mimicking" the activation gate, because Lamotrigine itself is voltage-gated.

Compounds aren't voltage-gated, the ion channels are. Your explanation of the electrical processes of a cell, while wordy, is pretty much right, though.

This leads to the assumption that Glutamatergic activation is essential to habit-forming behaviors. So, when dopamine neuronal firing is increased, the NMDA receptor starts firing faster and more often in order to desensitize these dopaminergic neurons.

Not only does this downregulate dopamine, but when Amphetamine is then not present, the increased levels of dopamine that it would have produced aren't there - yet NMDA is still firing faster and stronger than it should. This leads to the need to execute that habit in order to reach homeostasis. By blocking this receptor from activating, the dopamine neurons will upregulate again and eventually, NMDA won't be hyperactive.

This is overly simplistic, I think. Nobody really talks about NMDAr blockade reversing tolerance, either, just preventing the development of it in the first place. Your theory also fails to account for why NMDAr blockade is said to also help with, e.g., opioid therapy.

If NMDA can't modulate itself in the first place, would an addiction even occur?

Why do people get addicted to NMDA antagonists?

Although, using a calcium channel blocker is kind of a really indirect way of modulating NMDA.

One might say it doesn't modulate NMDA.
 
The Wiki article suggests lamotrigine has no, or minimal, effects on NMDA. More importantly, a generalised decrease in Glu release (especially an indirect one) does not 'single out' the NMDA receptor in the same way that NMDA antagonists do, c.f. why nobody uses sodium channel blockers like lamotrigine, phenytoin, etc. for treatment-resistant depression.

All of that is understood. However, it's not really direct effects on NMDAr I'm getting at here. I know Lamotrigine's only direct effect on NMDA would be if it just happened to block the action potential of an NMDA-bearing neuron and inhibited it from firing, but that's not what I was referring to. Since Lamotrigine widely inhibits the release of Glutamate, it stands to reason that with less universal Glutamate floating around, there would be less Glutamate binding to NMDAr, no?

Also, according to Wikipedia, agonism of mGluR (except mGluR2, mGluR3 and mGluR5) increases firing of NMDAr through downstream activity. If Lamotrigine inhibits overall Glutamatergic transmission, it will therefore directly and indirectly modulate NMDA receptors through decreased levels of Glutamate and decreased firing of mGluR (which is sort of the same thing as decreased Glutamate, assuming we're talking about a Glutamatergic neuron and not just a Glutamatergic receptor on a neuron).

Remember, there's a lot more to glutamate than NMDA... also the mGluR's and such. You must remember, the brain is very intricately balanced. There's much more to keep track of than a fistful of chemicals, and it's not always easy to go from a paper argument to the reality of the (grey) matter.

Again, understood. I'm well aware of AMPA, Kainate and Metabotropic receptors; they all have different downstream effects. I know that different systems in the brain and CNS balance each other out. I know it can be more than just chemicals. To give an example: ACh agonism causes downstream Histamine release, but Histamine agonism inhibits ACh release. However, I was talking about NMDAr specifically.

sekio said:
Compounds aren't voltage-gated, the ion channels are. Your explanation of the electrical processes of a cell, while wordy, is pretty much right, though.

I feel like that's something I knew, but I had to hear it in order to really "know it", if you know what I mean. Anyway, understood.

sekio said:
This is overly simplistic, I think. Nobody really talks about NMDAr blockade reversing tolerance, either, just preventing the development of it in the first place. Your theory also fails to account for why NMDAr blockade is said to also help with, e.g., opioid therapy.

Oh, have you not seen the studies? I kinda assumed everyone had (that sounds sarcastic, but it wasn't meant to be). NMDA antagonists have been seen to slow the development of tolerance to Opioids in a similar fashion to Amphetamines. Again, it probably has something to do with Glutamatergic receptors on Dopamine neurons.

sekio said:
Why do people get addicted to NMDA antagonists?

I get your point. However, I believe it's because downregulation of NMDA causes enhanced Dopaminergic transmission (which means it probably has some effect on all monoamines through downstream transmission and such). Dopamine release is a good indicator of an addictive substance, etc... However....

Let's say you're taking DXM for tolerance reduction/slowing of tolerance to Amphetamine. The reason you have to take the NMDA antagonist in the first place is that artificially increased Dopamine signalling (caused by the introduction of Amphetamine) is upramping NMDA firing. This upregulates Glutamate transmission in general.

You said it yourself - because Amphetamine is releasing Dopamine, thus causing Dopaminergic transmission, NMDA will start firing faster in response to these heightened levels of Dopamine in order to reach homeostasis. This is what would normally cause tolerance, and it's believed to also cause addiction (in this case). But when the NMDA antagonist is introduced, it greatly slows NMDAr firing, thus slowing the body's ability to become tolerant to Amphetamine and therefore probably increasing the amount of time it takes to become addicted. Therefore, the body can't reach homeostasis, because two artificial substances (that the body has no control over) are both influencing bodily function.

sekio said:
One might say it doesn't modulate NMDA.

I'm not sure how one can say that with any certainty. However, I will admit that the effect may be negligible in strength - but I don't believe so....
 
it stands to reason that with less universal Glutamate floating around, there would be less Glutamate binding to NMDAr, no?

Maybe. It doesn't mean that NMDA won't upregulate in response. For instance, with SSRI therapy, on paper one would expect to see permanent elevations in the amount of postsynaptic serotonin, but instead in the long term one sees a state of homeostasis rather than excess 5ht.

Oh, have you not seen the studies? I kinda assumed everyone had (that sounds sarcastic, but it wasn't meant to be). NMDA antagonists have been seen to slow the development of tolerance to Opioids in a similar fashion to Amphetamines. Again, it probably has something to do with Glutamatergic receptors on Dopamine neurons.

Where is there evidence that, after someone has already formed a huge tolerance to amphetamine, taking NMDAr antagonists is going to bring them to baseline? I don't see a lot of evidence for that.
 
Where is there evidence that, after someone has already formed a huge tolerance to amphetamine, taking NMDAr antagonists is going to bring them to baseline? I don't see a lot of evidence for that.

This is a subconscious mistake I keep making. I mean to say that it reduces the buildup of tolerance to Amphetamines/Opioids when they are in use.

When they aren't in use, it seems to me that the extensive studies that have been done on dissociatives are all that is really needed to show that taking an NMDA antagonist will assist in lowering tolerance over time. That is, if I am thinking of it in the correct way. Yes - NMDA will desensitize in response to the antagonist; however, antagonism of NMDA indirectly allows Dopamine to upregulate itself.

Because Dopamine is upregulating itself due to the decreased NMDAr activity, this actually causes NMDAr to downregulate on Dopaminergic neurons even more. Sure, it may become less sensitive to antagonists; however, the now-upregulated Dopamine can keep it in check itself.

Granted, the dopamine system will eventually become desensitized to the increased amounts of Dopamine as well, but by that time NMDAr has already reached homeostasis again and Dopamine regulation is back to normal as well.
 
Where is there evidence that, after someone has already formed a huge tolerance to amphetamine, taking NMDAr antagonists is going to bring them to baseline? I don't see a lot of evidence for that.

There has been a lot of success using NMDA antagonists after cessation of use to rapidly reduce tolerance though. Do you have any explanations for this? It's a curious thing. I've had it happen to me with both memantine and DXM in regards to alcohol tolerance (abolished to the point that after two weeks of abstinence, a single beer gave a mild buzz - previously this effect would not be achieved until at least 6 beers had been consumed), and memantine reset MDMA and phenethylamine tolerance as well.

Upon cessation of the antagonist, tolerance development starts building again but it seems to stay low as long as a maintenance dose of the antagonist is taken.

If I could stomach the idea of full-blown benzo withdrawal, I would stop taking my Valium and run tests to see if Delsym rapidly dropped benzo tolerance as well. I am currently using it rather successfully to taper without any of the common physical symptoms though and it also curiously abolished any and all alcohol and tobacco cravings.
 
There has been a lot of success using NMDA antagonists after cessation of use to rapidly reduce tolerance though. Do you have any explanations for this? It's a curious thing. I've had it happen to me with both memantine and DXM in regards to alcohol tolerance (abolished to the point that after two weeks of abstinence, a single beer gave a mild buzz - previously this effect would not be achieved until at least 6 beers had been consumed), and memantine reset MDMA and phenethylamine tolerance as well.

Upon cessation of the antagonist, tolerance development starts building again but it seems to stay low as long as a maintenance dose of the antagonist is taken.

If I could stomach the idea of full-blown benzo withdrawal, I would stop taking my Valium and run tests to see if Delsym rapidly dropped benzo tolerance as well. I am currently using it rather successfully to taper without any of the common physical symptoms though and it also curiously abolished any and all alcohol and tobacco cravings.

It's believed all addictive drugs modulate the dopamine system in some way. I guess this would indirectly increase NMDAr activity - however, Benzodiazepines do not directly modulate Dopamine the way Phenethylamine derivatives and Opioids do. Therefore, taking an NMDA antagonist would do nothing for reducing Benzodiazepine tolerance.

Ethanol is a direct NMDA antagonist itself. Whatever "tolerance reduction" you felt from using DXM before Alcohol was purely placebo. In fact, taking DXM would (in theory) increase tolerance to Alcohol, because NMDAr would desensitize to antagonists. It's possible you could have felt additive effects from combining DXM and Ethanol; that's more than likely what happened.
 
Ethanol is a direct NMDA antagonist itself. Whatever "tolerance reduction" you felt from using DXM before Alcohol was purely placebo. In fact, taking DXM would (in theory) increase tolerance to Alcohol, because NMDAr would desensitize to antagonists. It's possible you could have felt additive effects from combining DXM and Ethanol; that's more than likely what happened.

Antagonists block ligands and agonists from binding, simply by occupying the binding site, and can cause upregulation, correct? So how do you draw that conclusion?
 
Antagonists block ligands and agonists from binding, simply by occupying the binding site, and can cause upregulation, correct? So how do you draw that conclusion?

No, not all antagonists work that way.

Antagonists that block the binding site directly are known as "competitive antagonists." They get this name because they "compete" with the endogenous ligand for the binding site. They build tolerance in the same way agonists do: eventually, the neuron will become "desensitized" to the antagonist. As time goes on (and as tolerance to the antagonist builds up), higher concentrations of the ligand will bind to the receptors. This lets the endogenous ligand "override" the competitive antagonist and agonize the receptor anyway. You can also "override" the antagonist by simply increasing levels of the endogenous ligand.

The term non-competitive antagonist is used to describe two types of antagonists, but the same term covers both because the end result is the same. These antagonists bind either to the active site or to an allosteric site. They are called "non-competitive" because no matter how high the concentration of the agonist gets, it cannot overcome the block.

Competitive antagonists mean the endogenous ligand has to be present in higher concentrations in order to achieve the maximal response. Non-competitive antagonists reduce the magnitude of the maximum response that can be attained by any amount of the agonist. That's where the name "non-competitive antagonist" comes from - their effects cannot be surmounted by more agonist. But as tolerance builds up, the neurons will adapt to the antagonist, and in time the non-competitive antagonist's ability to cap the agonist's maximum response will be lessened. As the neurons adapt, they will therefore upramp the maximum response that an agonist can produce. (There's a rough numerical sketch of the competitive vs. non-competitive difference after this rundown of antagonist types.)

Inverse agonists can be competitive or non-competitive in how they bind. However, they're known as inverse agonists because, instead of simply blocking the receptor, they actively suppress its baseline (constitutive) activity and so produce the opposite response to an agonist.

Irreversible antagonists are antagonists that block the receptor permanently. They may have competitive or non-competitive effects at the receptor, but one very distinct thing happens. Normally, a receptor will "unbind" from a ligand and "move on"; the body then stops the endogenous ligand from activating receptors again through reuptake. However, when the receptor "tries" to unbind from an irreversible antagonist, it can't. This, in a sense, takes the receptor out of commission - nothing can bind to it again until the receptor is replaced.

Neutral antagonists are antagonists that don't produce a response one way or the other - they neither activate the receptor nor suppress its baseline activity. Instead they literally block any other ligand from being able to bind to the receptor. This, in a sense, produces an antagonistic effect because less agonist can bind. You would think this wouldn't affect tolerance, but the neuron adapts in the same way mentioned above.
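To make the competitive vs. non-competitive distinction above concrete, here's a small Python sketch (my own illustration, not from any study cited in this thread). It uses a plain Hill-type dose-response curve plus the Gaddum relation for competitive block, and models the non-competitive case crudely as scaling down the maximum response; the numbers are arbitrary and only meant to show the shape of the difference:

def response_competitive(agonist, ec50, antagonist, kb, emax=100.0):
    # Gaddum equation: a competitive antagonist raises the apparent EC50 by (1 + [B]/Kb)
    ec50_apparent = ec50 * (1.0 + antagonist / kb)
    return emax * agonist / (agonist + ec50_apparent)

def response_noncompetitive(agonist, ec50, antagonist, kb, emax=100.0):
    # crude model: the antagonist scales down the attainable maximum, EC50 unchanged
    emax_apparent = emax / (1.0 + antagonist / kb)
    return emax_apparent * agonist / (agonist + ec50)

EC50, KB, B = 1.0, 1.0, 10.0  # arbitrary units; antagonist present at 10x its Kb
for A in (0.1, 1.0, 10.0, 100.0, 1000.0):
    comp = response_competitive(A, EC50, B, KB)
    nonc = response_noncompetitive(A, EC50, B, KB)
    print(f"[agonist]={A:7.1f}  competitive block: {comp:5.1f}%   non-competitive block: {nonc:5.1f}%")

# The competitive column climbs back toward 100% as the agonist concentration rises
# (the block can be "overridden"), while the non-competitive column tops out around 9%
# no matter how much agonist is added.

That's the "surmountable vs. insurmountable" point in two functions; it says nothing about tolerance or receptor adaptation, which is a separate and slower process.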
 
So, essentially, the brain can become desensitized/build tolerance to non-competitive antagonists. Fascinating. I knew I needed a more recent medical chem textbook.

Sorry for my inexpertise, I'll go back to lurking in the corner now!
 
So, essentially, the brain can become desensitized/build tolerance to non-competitive antagonists. Fascinating. I knew I needed a more recent medical chem textbook.

Sorry for my inexpertise, I'll go back to lurking in the corner now!

No problem :)

All it takes is time to accumulate advanced knowledge of pharmacology. I found that reading the Wikipedia summaries is a great way to get a basic understanding of how drugs and endogenous chemicals affect the central nervous system. However, if you're really as passionate about the subject as I am, I recommend reading studies and books on the subject. That's a great way to gain a more advanced understanding of the inner workings of drugs and how they affect the central nervous system.
 
It's believed all addictive drugs modulate the dopamine system in some way. I guess this would indirectly increase NMDAr activity - however, Benzodiazepines do not directly modulate Dopamine the way Phenethylamine derivatives and Opioids do. Therefore, taking an NMDA antagonist would do nothing for reducing Benzodiazepine tolerance.

I don't use NMDA antagonists for tolerance, only to ward off most of the withdrawal symptoms. Even my psychiatrist is aware of NMDA and AMPA antagonism almost completely abolishing physical withdrawal symptoms from benzodiazepines. She prescribes memantine when insurance will cover it (mine unfortunately will not), and she has over an 80% success rate with memantine alone; she estimates about 90% when she adds topiramate to the mix. She also said that if her patients kept med and side effect/withdrawal logs like I do, the success rate would certainly be higher.

Ethanol is a direct NMDA antagonist itself. Whatever "tolerance reduction" you felt from using DXM before Alcohol was purely placebo. In fact, taking DXM would (in theory) increase tolerance to Alcohol, because NMDAr would desensitize to antagonists. It's possible you could have felt additive effects from combining DXM and Ethanol; that's more than likely what happened.

Ethanol is not an NMDA antagonist in low concentrations. One beer would certainly be considered a low concentration. Ethanol IS dopaminergic, however, which at least would partially explain said tolerance reduction. This was not some perceived minor tolerance reduction - I have always had a very high tolerance to alcohol, probably due to my bloodline being a long line of alcoholics, myself an alcoholic. One beer never touched me even at the age of 7.

The argument about NMDAr desensitizing to NMDA antagonists - in high doses definitely. Neither memantine nor dextr(ometh)orphan have shown tolerance in low doses. I highly doubt the brain does anything about it until normal functioning is being seriously inhibited.

The DXM argument would be plausible if it wasn't for the fact that 60mg of Dextromethorphan a day is nothing and that the effect was initially noticed on memantine. In fact on memantine, alcohol exhibited a clean euphoria that it never had before, not unlike high dose Kava.

It is not like I'm the only one observing these effects either. It is a common thing.

Anyway, this is totally off-topic, I was just curious if the mechanics were completely understood and it would appear that there is still a lot of research left to do in this area. Carry on everybody.
 
I don't use NMDA antagonists for tolerance, only to ward off most of the withdrawal symptoms. Even my psychiatrist is aware of NMDA and AMPA antagonism almost completely abolishing physical withdrawal symptoms from benzodiazepines. She prescribes memantine when insurance will cover it (mine unfortunately will not), and she has over an 80% success rate with memantine alone; she estimates about 90% when she adds topiramate to the mix. She also said that if her patients kept med and side effect/withdrawal logs like I do, the success rate would certainly be higher.

Links to studies, perhaps?

You cannot conclude anything from just ONE user report, or from the fact that your psychiatrist has had an "80% success rate" with the combination. Unless you can provide solid proof to back that up (meaning studies that have been replicated more than once), everyone is going to take this with a grain of salt. Anything is possible; however, in order for the scientific community to even acknowledge there is something to this, there HAVE to be facts backing it up.

Ethanol is not an NMDA antagonist in low concentrations.

Proof? In my 15+ hours of researching Ethanol's pharmacodynamics, I've never heard this being stated. I'd love to see this personally. It would greatly enhance our understanding of Ethanol's effects in the body.

This was not some perceived minor tolerance reduction - I have always had a very high tolerance to alcohol, probably due to my bloodline being a long line of alcoholics, myself an alcoholic. One beer never touched me even at the age of 7.

Perceived is the key word here. As I stated above, no one can conclude anything from one user report. Also, what does "alcoholics being in your bloodline" have to do with tolerance? Tolerance is not addiction. You can be tolerant to a drug and not be addicted to it. Also, just because your family has a history of alcoholism doesn't mean you will automatically be tolerant to alcohol - that doesn't even make sense.

You may be more susceptible to becoming dependent on it; however, you will get the same effects out of Ethanol as everyone else. No one is automatically tolerant to any type of drug.

Ethanol IS dopaminergic, however, which at least would partially explain said tolerance reduction.

No. That would only be true if Ethanol actually had direct effects on Dopamine.

There is evidence to suggest that Ethanol raises Dopamine levels; however, its effects on Dopamine are very poorly understood at this time. One theory holds that Ethanol may inhibit the circulating enzymes that are responsible for breaking down Dopamine (however, the enzyme listed in that particular study is the same enzyme that breaks down all Monoamines).

Another theory is based around Acetaldehyde. There is evidence to suggest that Ethanol does not raise Dopamine levels itself, but the major metabolite of Ethanol (acetaldehyde) does raise Dopaminergic transmission in some way.

Either way, neither of the above has been confirmed. This requires much more research before anyone can claim that Ethanol does, in fact, raise dopamine levels.

The argument about NMDAr desensitizing to NMDA antagonists - in high doses definitely. Neither memantine nor dextr(ometh)orphan have shown tolerance in low doses. I highly doubt the brain does anything about it until normal functioning is being seriously inhibited.

This is a seriously misinformed statement.

All drugs cause tolerance. Period. It doesn't matter what dose you're taking, eventually that dose will not be effective because neurons in the brain have adapted. You will have to keep raising your dose as time goes on.

The DXM argument would be plausible if it wasn't for the fact that 60mg of Dextromethorphan a day is nothing and that the effect was initially noticed on memantine.

Ugh....no.

Dextromethorphan does produce effects at 60 mg! If it weren't active at that dose, then why would they indicate that dose for cough suppression? Being active and being recreational are two separate things. Lower doses of DXM are in fact active and do produce an effect - cough suppression. Just because you don't "get high" off of lower doses doesn't mean the drug isn't active.

In fact on memantine, alcohol exhibited a clean euphoria that it never had before, not unlike high dose Kava. It is not like I'm the only one observing these effects either. It is a common thing.

There's no need for another answer, I'll just quote what I said above.

"Links to studies, perhaps?

You cannot conclude any information from just ONE user report or based on the fact that your psychiatrist has had an "80% success rate" with the combination. Unless you can provide solid proof to back that up (meaning studies that have been replicated more than once) can anyone take this with more than a grain of salt. Anything is possible, however, in order for the science community to even acknowledge there is something to this - there HAS to be facts to back it up."

If this is "really a common thing" then prove it. Carry out the studies and bring back solid proof that this is what happens. Until then, the dump I took this morning has the same meaning as any user reports.

Anyway, this is totally off-topic, I was just curious if the mechanics were completely understood and it would appear that there is still a lot of research left to do in this area. Carry on everybody.

Seriously?

All you provided as backup for your argument is user reports and assertions. How can you sit here and criticize me when you have a very limited understanding of this information? And then you have the gall to sit here and tell me "it's not completely understood" when you tried to assert facts that aren't actually facts.

I try not to insult people in public, but that was just, plain, wrong.
 
I don't have access to all of the papers and such, nor do I want them. You're talking about things as if they are fully understood by science, which they clearly aren't, so who knows - we both could be wrong. I don't remember where I read that alcohol was not a potent NMDA antagonist in low concentrations but given the multiple levels of intoxication that alcohol causes, it is definitely plausible.

When it comes to ACTIVE DOSE, by your logic any dose of anything is active, which makes this whole argument moot. What I meant - and figured would be implied, considering we were talking about alcohol and DXM interacting and causing a perceived change in intoxication from alcohol - is that 60mg released slowly over a 24 hour period is not sufficient to cause any level of intoxication.

Further, your assertion that NMDA antagonism would cause tolerance to alcohol is flat out wrong. It could cause tolerance to the NMDA antagonist effects of alcohol, however, it has been shown that NMDA antagonists prevent tolerance from developing to alcohol:

http://www.ncbi.nlm.nih.gov/pubmed/1831064
http://www.ncbi.nlm.nih.gov/pubmed/14684445

4.5.1 Ethanol Intoxication

Ethanol intoxication is measured in rodents by the length of sleep time upon systemic injection of hypnotic doses (3–4 g/kg) of ethanol. Miyakawa et al. showed that Fyn deletion mice were more sensitive to intoxicating doses of ethanol and therefore their sleep time was longer than the Fyn+/− mice [63]. We found that systemic administration of the NR2B-specific inhibitor, ifenprodil, together with ethanol increased the length of sleep time of the Fyn+/+ mice to the same level as the Fyn−/− mice [124]. Taken together, these results suggest that Fyn-mediated phosphorylation of NR2B subunits and the development of acute tolerance reduce the in vivo sensitivity to hypnotic doses of ethanol.

Support for a potential role of NR2B-containing NMDARs in the attenuation of the level of intoxication was reported in a recent study in which systemic inhibition of NR2B-containing NMDARs with CGP-37848 or Ro-25-6981 significantly increased sleep time in C57BL/6J mice [125]. These results are also in line with numerous studies by Kalant and colleagues showing that the NMDAR antagonists (+)-MK-801 and ketamine blocked the development of rapid tolerance to ethanol exposure in vivo [126–128].

http://www.ncbi.nlm.nih.gov/books/NBK5284/

It stands to reason that if ketamine has the ability to do this, dextr(ometh)orphan could do so as well.

Sorry that I don't have the money to fund a study and have to go off of the reports of people who are using the NMDA antagonism model rather successfully.

If it's all placebo then it's a wonderful placebo, and I'll happily continue to use it and recommend it. So far I've kicked nicotine and alcohol, and I'm halfway down on my benzos with none of the usual intense withdrawal side effects except some mild insomnia and occasional diarrhea, so that's proof enough for me. I haven't even had a craving for nicotine or alcohol in a month and a half. I have no doubt in my mind that if I were to drink a few beers I would be smashed, but I'm not about to mess up my progress just to make a point that, according to you, has no merit without a true full-blown scientific study - the kind where one can skew the results in favor of their argument, much like the drug companies do. It isn't worth that much to me.

Actually, here are some more:

After lorazepam discontinuation, binding was increased at 4 and 7 days versus chronically treated animals and versus vehicle within the cerebral cortex. This effect was abolished by coadministration of CPP as well as by CPP administration during the lorazepam withdrawal period. These data support the involvement of the glutamatergic system in benzodiazepine tolerance and discontinuation.

http://www.karger.com/Article/Abstract/139531

This one in particular is of interest regarding the NMDA/AMPA antagonist combo that my psychiatrist uses:

Long-term treatment leads to tolerance to and dependence on benzodiazepines. Abrupt termination of benzodiazepine administration triggers the expression of signs of dependence. Mice withdrawn from chronic treatment with diazepam showed a time-related evolution of anxiety, muscle rigidity, and seizures between days 4 and 21 after treatment discontinuation. A period between withdrawal days 1 and 3 was symptom-free. Surprisingly, during this "silent phase" the susceptibility of mice to alpha-amino-3-hydroxy-5-tert-butyl-4-isoxazolepropionate (ATPA) and kainate seizures and the magnitude of monosynaptic reflexes mediated by non-N-methyl-D-aspartate (NMDA) mechanisms were enhanced. In apparent contrast, the "active phase", between withdrawal days 4 and 21, was characterized by increased susceptibility to NMDA seizures and enhanced magnitude of polysynaptic reflexes, which are NMDA dependent. Treatment of mice with alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionate (AMPA) antagonists 1-(4-aminophenyl)-4-methyl-7,8-methylenedioxy-5H-2,3-benzodiazepine (GYKI 52466) or 2,3-dihydroxy-6-nitro-7-sulfamoylbenzo(f)quinoxaline but not with the NMDA antagonist 3-[(+/-)-2-carboxypiperazin-4-yl]-propyl-1-phosphonate (CPP) during the silent phase prevented signs of dependence. In contrast, treatment with CPP but not with GYKI 52466 during the active phase prevented the symptoms. The development of tolerance to and dependence on diazepam was prevented by concurrent treatment of mice with CPP but was not prevented by GYKI 52466. These data indicate that NMDA-dependent mechanisms contribute to the development of tolerance to diazepam and to the expression of signs of dependence in mice after termination of long-term treatment with diazepam. Nevertheless, the non-NMDA-mediated silent phase is essential for triggering the symptoms. Therefore, AMPA antagonists may offer a therapeutic approach for preventing dependence on benzodiazepines that is an alternative to NMDA antagonism.

http://www.pnas.org/content/90/14/6889.short

We have reexamined the effect of NMDA antagonists [(+)MK-801 and ketamine] on rapid tolerance to chlordiazepoxide. (+)MK-801 and ketamine blocked the development of rapid tolerance to chlordiazepoxide, but this effect was dependent on the dose ratio of the NMDA antagonist to that of the benzodiazepine used to produce rapid tolerance. Furthermore, NMDA antagonists blocked both learned and unlearned tolerance to chlordiazepoxide. It appears that in addition to impairment of memory and learning, NMDA antagonists may also influence some other mechanism involved in the production of drug tolerance.

http://www.sciencedirect.com/science/article/pii/S0361923096002195
 
That's better; why didn't you just post that in the first place?

However, there are still quite a few things you are presenting as fact that aren't.

I don't remember where I read that alcohol was not a potent NMDA antagonist in low concentrations but given the multiple levels of intoxication that alcohol causes, it is definitely plausible.

Amphetamine causes multiple levels of intoxication; so do DXM, Diphenhydramine, and every other drug out there. If you take less, you will be "less" intoxicated. Simple as that.

When it comes to ACTIVE DOSE, by your logic any dose of anything is active, which makes this whole argument moot. What I meant - and figured would be implied, considering we were talking about alcohol and DXM interacting and causing a perceived change in intoxication from alcohol - is that 60mg released slowly over a 24 hour period is not sufficient to cause any level of intoxication.

That's not what your first post said - you said "60mg of Dextromethorphan a day is nothing". There is nothing here to make me believe that you were talking about recreational doses. Your post implied that DXM has no effects on the body at that dose, or that its effects were negligible. If that were true, then why would DXM slow the buildup of tolerance at that dose?

It's not enough to make you fucked up, but that's not what this is about. This is about taking therapeutic doses of Dextromethorphan or some other NMDA antagonist. I'm not here to argue about getting fucked up on K or DXM - this is about taking some sort of Glutamatergic blocker at a dose where it doesn't produce intoxicating effects on its own.

Further, your assertion that NMDA antagonism would cause tolerance to alcohol is flat out wrong. It could cause tolerance to the NMDA antagonist effects of alcohol, however, it has been shown that NMDA antagonists prevent tolerance from developing to alcohol

While that's interesting, you fail to address one point. Where does it say that NMDA antagonists prevent tolerance?

The study suggests that it slows the buildup of tolerance to ethanol. It never says it inhibits tolerance entirely. Also, if NMDA antagonists slow tolerance to Ethanol, then why doesn't Ethanol prevent tolerance to itself?

After lorazepam discontinuation, binding was increased at 4 and 7 days versus chronically treated animals and versus vehicle within the cerebral cortex. This effect was abolished by coadministration of CPP as well as by CPP administration during the lorazepam withdrawal period. These data support the involvement of the glutamatergic system in benzodiazepine tolerance and discontinuation.

The Lorazepam study says nothing about decreasing tolerance to Lorazepam, only slowing it. It does state that taking CPP decreased the amount of time the rats went through withdrawals, which is interesting nonetheless, but even the study admits the result may be partly down to effects that CPP produces on its own.

I also fail to see what any of this has to do with Ethanol being dopaminergic or how having a history of alcoholism in your family increases tolerance.

SwampFox
 
Also, if NMDA antagonists slow tolerance to Ethanol, then why doesn't Ethanol prevent tolerance to itself?

Ethanol is not primarily an NMDA antagonist. It's a very dirty, broad-spectrum drug.

Where does it say that NMDA antagonists prevent tolerance?

uh.... NMDAR antagonists (+)-MK-801 and ketamine blocked the development of rapid tolerance to ethanol exposure in vivo? Among other studies, of course.[1][2][3][4]

Either way, neither of the above has been confirmed. This requires much more research before anyone can claim that Ethanol does, in fact, raise dopamine levels.
Preferential stimulation of dopamine release in the nucleus accumbens of freely moving rats by ethanol.

Ethanol is not directly dopaminergic, but it does release dopamine in the NAcc by some combination of its effects (GABA-A?). That's why it is rewarding and reinforcing.

Also, just because your family has a history of alcoholism doesn't mean you will automatically be tolerant to alcohol - that doesn't even make sense.

Some people have the genetics to tolerate alcohol exposure better than others, or a genetic predisposition to the euphoric effects of alcohol.

How can you sit here and criticize me when you have a very limited understanding of this information?

This is like watching kindergarteners argue.
 
Ethanol is not primarily an NMDA antagonist. It's a very dirty, broad-spectrum drug.

....I know. He tried to claim that Ethanol has lower binding affinity at the N-Methyl-D-Aspartate receptor in smaller doses. Obviously, increasing levels of Ethanol will increase how much of it can bind to NMDA receptors. But I fail to see how ingesting more ethanol would increase its ability to form a complex with a receptor.

uh.... NMDAR antagonists (+)-MK-801 and ketamine blocked the development of rapid tolerance to ethanol exposure in vivo? Among other studies, of course.[1][2][3][4]

Reread what I said. I acknowledged that it does slow the buildup of tolerance, but none of those studies suggested that taking an NMDA antagonist stops tolerance entirely. You even said that above! THAT'S WHAT THIS WHOLE THREAD IS ABOUT!!!! *look of disapproval*

Ethanol is not directly dopaminergic, but it does release dopamine in the NAcc by some combination of its effects (GABA-A?). That's why it is rewarding and reinforcing.

Once again, I think you failed to read my post entirely. I acknowledged this, but he was trying to claim that Ethanol itself has some sort of direct dopaminergic effects.

Some people have the genetics to tolerate alcohol exposure better than others, or a genetic predisposition to the euphoric effects of alcohol.

Really sekio? This isn't even related to what I said. I'm aware there are genes that make people more susceptible to addiction. However, I remember there being a big discovery last year where they were able to show a couple of things about the genes of people with ADHD. They were able to identify, for the first time, a mutated gene that causes a mental illness. The gene that controls the development of DAT (the gene itself is called DAT1) is overexpressed in people with ADHD.

They were also able to show that people with an overexpressed DAT1 gene were way more likely to become addicted to Nicotine or Ethanol at some point in their life. So I know there is a definite correlation between genes and likelihood of addiction. But I fail to see how the presence, absence, or increased expression of a gene could cause someone to have tolerance to Ethanol before they've even used it.

Yes, genes can make you more susceptible to addiction, and I believe you're right that tolerance to Ethanol could build up faster in someone with a genetic predisposition - but it wouldn't be there initially. If innate tolerance is what you were suggesting, I don't believe it. I'd have to see studies before I believed that Ethanol tolerance could exist before administration had ever occurred.

This is like watching kindergarteners argue.

Really? Your knowledge of chemistry is way more advanced than mine - I don't deny that. But I'm very solid in my understanding of Pharmacodynamics - I'm starting to think you like correcting me for the sake of correcting me.
 
sekio already responded to most of this so I'm going to just make a few comments.

That's better; why didn't you just post that in the first place?

Because I didn't have time to go through and fetch articles at that point.

That's not what your first post said - you said "60mg of Dextromethorphan a day is nothing". There is nothing here to make me believe that you were talking about recreational doses. Your post implied that DXM has no effects on the body at that dose, or that its effects were negligible. If that were true, then why would DXM slow the buildup of tolerance at that dose?

What I was talking about was that you said that most likely my perceived tolerance reduction was a combination of DXM and alcohol. I responded by stating that "60mg of Dextromethorphan a day is nothing", but I meant that it's nothing in the sense that it's not recreational at that dose.

The Lorazepam study says nothing about decreasing tolerance to Lorazepam, only slowing it. It does state that taking CPP decreased the amount of time the rats went through withdrawals, which is interesting nonetheless, but even the study admits the result may be partly down to effects that CPP produces on its own.

I already stated that "I don't use NMDA antagonists for tolerance, only to ward off most of the withdrawal symptoms" in my initial post. My goal is to get off of 9 years of prescribed benzodiazepines, and I am using NMDA antagonism to do it. That said, I successfully used memantine and Delsym before to drop from 60mg of diazepam a day to 20mg in a month and a half. This was after cutting over from 4mg of Klonopin and thus being in mild tolerance withdrawal, because I did not switch to 80mg, which would have been the proper equivalent dose. I went down to 20mg with little effort and hit 15mg before I ran out of Delsym; I then had a series of seizures, which prompted me to go back to 20mg, and I stabilized. So while I am not using it for that purpose this time, I have used NMDA antagonism in the past to drop my dose much more rapidly than I would have safely been able to do without it, which suggests an open research area.

I also fail to see what any of this has to do with Ethanol being dopaminergic or how having a history of alcoholism in your family increases tolerance.

sekio explained the dopaminergic part and part of the tolerance issue, but here is some more information about genetic alcohol tolerance. I don't have access to the original articles, but they are listed at the bottom of the About link.

Tolerance and the Predisposition to Alcoholism

Animal studies indicate that some aspects of tolerance are genetically determined. Tolerance development was analyzed in rats that were bred to prefer or not prefer alcohol over water (26,27). The alcohol-preferring rats developed acute tolerance to some alcohol effects more rapidly and/or to a greater extent than the nonpreferring rats (26). In addition, only the alcohol-preferring rats developed tolerance to alcohol's effects when tested over several drinking sessions (27). These differences suggest that the potential to develop tolerance is genetically determined and may contribute to increased alcohol consumption.

In humans, genetically determined differences in tolerance that may affect drinking behavior were investigated by comparing sons of alcoholic fathers (SOA's) with sons of nonalcoholic fathers (SONA's). Several studies found that SOA's were less impaired by alcohol than SONA's (28,29). Other studies found that, compared with SONA's, SOA's were affected more strongly by alcohol early in the drinking session but developed more tolerance later in the drinking session (30). These studies suggest that at the start of drinking, when alcohol's pleasurable effects prevail, SOA's experience these strongly; later in the drinking session, when impairing effects prevail, SOA's do not experience these as strongly because they have developed tolerance (30). This predisposition could contribute to increased drinking and the risk for alcoholism in SOA's.

http://alcoholism.about.com/cs/alerts/l/blnaa28.htm

I can confirm that it is very much like that for me.
 