Your best course of action is talking to a sex addiction therapist
I can't think of how else to frame my question, but is the trip as "hardcore" as people make it out to be? What kind of preparation do you need to do?
There really aren't any established criteria defining what counts as an excessive increase in gene or protein expression. Remember, there isn't even a consistent way to measure expression level, because studies don't normally measure the absolute amount of protein present (there isn't an easy way to do that), but rather typically use indirect measures such as immunostaining intensity, western blot, or cell counting. Those are usually comparative methods (i.e., this tissue section has a higher level than this other tissue section). It is often impossible to compare those results across studies because of sensitivity differences. Mass-spec methods can be used to detect protein levels directly but are rare.
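To make the comparability point concrete, here is a minimal sketch (the intensity values are hypothetical, not real data) of why a within-study fold change is meaningful even though the raw intensities can't be compared across studies:

```python
def fold_change(treated, control):
    """Relative expression: mean treated signal over mean control signal."""
    return (sum(treated) / len(treated)) / (sum(control) / len(control))

# Hypothetical immunostaining intensities from two labs imaging the same
# biology, but with lab B's detector at twice the gain of lab A's:
lab_a_control, lab_a_treated = [10, 12, 11], [20, 24, 22]
lab_b_control, lab_b_treated = [20, 24, 22], [40, 48, 44]

# Raw intensities differ across labs, but each lab's internal ratio agrees:
print(fold_change(lab_a_treated, lab_a_control))  # → 2.0
print(fold_change(lab_b_treated, lab_b_control))  # → 2.0
```

The absolute numbers carry no meaning outside a given study's imaging conditions, which is exactly why the ratio (not the intensity) is what gets reported.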
Often, increases induced by non-natural manipulations (e.g., viral constructs, drug administration) are classified as overexpression, whereas increases induced by natural means (stress, sex, etc) are classified as expression increases.
See what I wrote above. Usually overexpression is used in the context of an artificial manipulation, whereas increase is usually used for changes induced by natural means.
Medium spiny neurons (MSNs). Cocaine, ethanol, THC, fluoxetine, and social defeat stress in resilient mice induce expression in D1 MSNs; haloperidol induces expression in D2 MSNs; morphine, heroin, sucrose, calorie restriction and juvenile environmental enrichment induce expression in D1 MSNs and D2 MSNs:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3834048/#!po=54.1322
Remember, there isn't even a consistent way to measure expression level, because studies don't normally measure the absolute amount of protein present (there isn't an easy way to do that), but rather typically use indirect measures such as immunostaining intensity, western blot, or cell counting.
That's not true at all. We use droplet digital PCR to perform absolute quantification of transcripts, and ELISA for protein quantification. For my research, an ELISA is time-consuming, but it's reliable, sensitive and can be incredibly accurate.
It may not have been clear from my post, but I was specifically referring to measurement of ΔFosB in relevant studies that we have been discussing. Most studies measuring ΔFosB expression in a cell- and brain region-specific manner use immunohistochemistry or Western Blot for detection. I never said there aren't other methods that could be used to detect and measure ΔFosB expression for the same purposes, but those methods are not typically used by studies relevant to the present topic.
Genetic (i.e., non-viral) overexpression of ΔFosB in a set of neurons can be objectively determined by the occurrence of the virally induced behavioral phenotype following chronic exposure to a stimulus, in conjunction with measurements of its expression (see the definition I quoted earlier).
The behavioral phenotype induced by ΔFosB overexpression in D1-type NAcc MSNs is described in the review that I cited earlier.
So, what's the reason they don't? There are obviously Taqman probes for ΔFosB, so I'm not sure why they're not used for absolute quantification. I also don't know what you're getting at with the methods not being relevant; I'm a fan of using whatever it takes to get the data you need, regardless of how "standard" your methods are.
There are two reasons. First, for any given study, there is no absolute need to perform a strict quantification. The studies just want to measure a response to some manipulation, and relative measures are good enough to identify and quantify that response.
Second, it is extremely difficult to look at expression levels using the techniques you mentioned in a manner that is cell- and region-specific. Groups are working on ways to run flow cytometry so that you could sort all of the cells in a brain section micropunch -- which would allow some of those other techniques to be used -- but that still is not common.
You use ELISA to measure expression of 2 or more proteins within individual neurons in a single brain region? We are not talking about studies using PCR, because that wouldn't measure accumulation of ΔFosB over an extended period of time. For these studies, there is a requirement to detect two proteins (or one protein and one transcript) in individual cells, because the studies have to measure ΔFosB accumulation and classify the cell chemotype. The classification and measurement also have to be done in a manner that addresses regional differences within striatal subregions.

Right, so an entirely new method of gene expression quantification was developed even though it was extraneous and unnecessary. I don't buy that for a minute.
Of course, whenever I do gene expression I'm looking for relative changes, but we still use ddPCR because you get unparalleled accuracy and predictability from a single PCR reaction. As far as gene expression goes, there's absolutely no way to get that amount of statistical power in such a short period of time.
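For what it's worth, the "absolute" in ddPCR absolute quantification comes down to simple Poisson statistics on the droplet counts. A minimal sketch (the ~0.85 nL droplet volume is an assumed instrument constant, not a value from this thread):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Estimate absolute target concentration from ddPCR droplet counts.

    Because multiple template copies can land in one droplet, the mean
    copies per droplet is recovered from the Poisson zero term:
    lambda = -ln(fraction of negative droplets).
    """
    negative_fraction = (total - positive) / total
    lam = -math.log(negative_fraction)   # mean copies per droplet
    copies_per_nl = lam / droplet_volume_nl
    return copies_per_nl * 1000          # copies per microliter

# e.g., 4,000 positive droplets out of 16,000 accepted droplets:
print(round(ddpcr_copies_per_ul(4000, 16000), 1))  # → 338.4
```

No standard curve is needed, which is where the accuracy and reproducibility you mention come from: the only calibration factor is the droplet volume.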
Weird, I do that every day. Difficult does not mean impossible.
You use ELISA to measure expression of 2 or more proteins within individual neurons in a single brain region? We are not talking about studies using PCR, because that wouldn't measure accumulation of ΔFosB over an extended period of time. For these studies, there is a requirement to detect two proteins (or one protein and one transcript) in individual cells, because the studies have to measure ΔFosB accumulation and classify the cell chemotype. The classification and measurement also have to be done in a manner that addresses regional differences within striatal subregions.
I never argued that the techniques you mentioned are generally extraneous and unnecessary, but they may be for certain applications. There are always a range of techniques that could be used to test a hypothesis. Given that immunocytochemistry is the standard way to conduct the analysis we are discussing, is it really surprising that it is common to see studies using it? It isn't even necessarily possible to get reviewers to allow you to use a relatively new approach when established procedures work very well.
It is actually almost certain that 18-MC produces hallucinogenic effects. The goal of synthesizing 18-MC was to develop a version of ibogaine that doesn't act on sigma-2 sites, because those interactions are believed to cause ibogaine-induced tremor and Purkinje cell degeneration. But 18-MC retains ibogaine-like kappa affinity:
https://www.google.com/url?q=http:/...ggTMAQ&usg=AFQjCNGm2d17gUF1cCXgjX9IvYROSGHrxg
The kappa opioid receptor (KOR) is likely the primary mechanism for the hallucinogenic effects of ibogaine. Ibogaine has very interesting effects at KOR -- ibogaine and noribogaine are functionally selective KOR agonists. Usually KOR agonists produce hallucinations and dysphoria; part of the stress response and withdrawal dysphoria is due to release of dynorphin, which activates KOR. The functional selectivity of ibogaine and especially noribogaine may allow them to induce a KOR-mediated hallucinogenic response without inducing dysphoria; furthermore, noribogaine may persistently suppress drug withdrawal by occupying KOR for an extended period, which would prevent dynorphin from binding.
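The "occupying KOR for an extended period" point can be illustrated with the standard one-site occupancy relation. The numbers below are purely illustrative, not measured noribogaine affinities:

```python
def fractional_occupancy(ligand_nM, kd_nM):
    # One-site binding at equilibrium: occupancy = [L] / ([L] + Kd)
    return ligand_nM / (ligand_nM + kd_nM)

# Illustrative only: a ligand held at 10x its Kd occupies ~91% of
# receptors, leaving little room for released dynorphin to bind.
print(round(fractional_occupancy(100.0, 10.0), 2))  # → 0.91
```

So as long as noribogaine's long half-life keeps its free concentration well above its Kd at KOR, dynorphin would be largely shut out, which is consistent with the persistent anti-withdrawal effect described above.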