
What is it for a theory to be parsimonious?

ebola?

Okay. So Occam's Razor is rather well accepted as a heuristic (but not a monolithic trend). But what makes one explanatory hypothesis or theoretical framework simpler than another? Should the math underlying the modeling be less computationally intensive? Should the number of justifying axioms be minimized?

It seems to me that in a lot of cases, people consider abandonment of common-sensical assumptions a mark of reduced parsimony, but I don't see why this would be the case.

ebola
 
Isn't Occam's Razor focused on minimising the number of assumptions as its measure of simplicity? So if you're going to tell me that "assumptions" and "axioms" are not interchangeable terms here, then I will ask why they are different. Or was your final question just a bit of Occam's Rhetoric?

Anyway, here are a few reasons I can think of for why Occam's Razor is the method that earned a proper, capitalised name. I would assume that going with a common-sense approach that is simple to understand would 1) allow a wider team of individuals to understand the framework readily and quickly, so more minds can follow the experimentation in its early stages than if a more complicated framework had been conjured up, and 2) make it easier and/or more cost-effective to backtrack out of methods that are simpler and more straightforward.

people consider abandonment of common-sensical assumptions a mark of reduced parsimony

"Reduced parsimony" would mean more willingness to use resources. So abandonment of common-sensical assumptions would reduce parsimony by way of additional trial-and-error as many people may require several iterations of experimentation before the assumptions actually make sense. Not everyone is going to understand advanced assumptions on paper before getting their hands dirty (I guess? I know I wouldn't want to start until I did).
 
Isn't Occam's Razor focused on minimising the number of assumptions as its measure of simplicity? So if you're going to tell me that "assumptions" and "axioms" are not interchangeable terms here, then I will ask why they are different. Or was your final question just a bit of Occam's Rhetoric?

Sorry that I was unclear. I meant to draw attention to my opinion that simplicity of assumption needs to mean something more sophisticated than just minimizing the raw count of assumptions made.
...
We might be talking about disparate matters here. I am thinking of the use of parsimony in increasing the likelihood of a theory's validity by direct virtue of its simplicity, whereas you appear to be talking about facilitation of practices of verification.

ebola
 
I love Occam's razor. It's certainly a rich topic.

It's not merely the number of computations. E.g., would it take less math to explain the flow of a river down a hill using gravity and the principle of "path of least resistance" (the water will always go where it's easiest to go), or the claim that water just "inherently likes" to make loopty-loops in the earth and get to the lowest point possible? The first explanation would involve measurement and calculation of the substrate through which the water flows, knowledge of gravity, temperature/climate, and all the other natural laws that come into play in the formation of a river. The second just imbues the water itself with the property of "wanting" to make bendy loops in the ground because "that's just what water does." How much calculation came into the picture with the second assumption?
That assumption of course leads to other assumptions which are simple and seem to make sense, such as that a bowling ball would fall faster than a pebble because it has more "stuff" to do the "wanting." Clearly, the larger a group of things that "want" to do something, the more of that thing they will do, such as matter enjoying getting to the lowest point possible. Pebbles only have a tiny bit of material to do the "wanting." This only led to a minor misunderstanding of the physical world that took about 1,800 years to correct, but it was a simple explanation.
 
Simplicity is a difficult concept to define satisfactorily. The best answer I can venture regarding a theory's degree of simplicity involves representing its propositional bones as a network of logic gates on a circuit board and simply measuring the voltage required to "execute" it: the lower the voltage, the simpler the theory. Obviously this solution doesn't account for every consideration we'd like to entertain in understanding theoretical simplicity, but perhaps some analogue of it could be derived by using some sort of highly refined fMRI technique to quantify the apparent bioelectrical energy needed to subjectively understand any newly learned theory. Within such a hypothetical framework, theoretical simplicity could be measured directly as caloric expenditure in learning subjects. It might seem like an elaborate and indirect way to define theoretical simplicity, and it would certainly require a lot of statistical treatment and data sifting, but it's also true that more cognitively demanding tasks quite literally require more physical energy, so I think the approach possesses at least some degree of explanatory utility.

I also find this operational efficiency approach appealing for understanding notions of intelligence. A person with greater intelligence simply does not have to work as hard to understand novel concepts because their conceptual network is more efficient.
 
coffee said:
It's not merely the number of computations. E.g., would it take less math to explain the flow of a river down a hill using gravity and the principle of "path of least resistance" (the water will always go where it's easiest to go), or the claim that water just "inherently likes" to make loopty-loops in the earth and get to the lowest point possible?

I like this analogy and agree completely. The parsimony of the initial axioms held appears to defy quantification in terms of computational complexity. Namely, holding the axiom that water does something it 'wants' seems less parsimonious than integrating water into more generalized physical theories, because imbuing physical entities with the capacity to will (or something similar) leads to a host of unwanted complications when investigating physical phenomena in general. It is for this reason that positing the existence of a god lacks parsimony to the point of standing as unscientific: use of such a conceptual tool involves positing something that can do anything, and thus 'explain' everything, since its implications can be extended arbitrarily.

pseudonym said:
perhaps some analogue of it could be derived by using some sort of highly refined fMRI technique to quantify the apparent bioelectrical energy needed to subjectively understand any newly learned theory.

Not quite. Because the brain functions as a distributed and widely connected network, the sum propagation of action potentials seems to correlate poorly with processing intensity; total energy expenditure during undirected waking attentiveness and during intense conceptual work appears rather similar, for example. Put quite roughly, even though maintenance of the CNS is extremely metabolically intensive, this intensity does not correlate with the sum of mental processing that reaches levels of subjective awareness.
...
I like your line of thought with the analogy to silicon-implemented computation, though. What it leaves out, as I expect you're aware, is scrutiny of the initial axioms that render the results of the computation intelligible, axioms rooted in the social processes through which humans produce meaning.

ebola
 
Can you define what you mean by parsimonious? I'm not familiar with the word but even after looking it up I still don't quite know how it applies to the question.
 
eballsack? said:
But what makes one explanatory hypothesis or theoretical framework simpler than another?
Complexity is measured by the number of constant/variable pairs that make up the model. The typical procedure is: take all your data points and put them on a graph, make an assumption about the type of model the data represents (diff eq, neural network, etc.), plug in your data points as factors, then adjust the constants until the model fits the curve. The problem with too many factors is overfitting the curve, which causes a lack of generalization. The more variables you have, the more likely it is you'll find correlations that apply to your data but don't generalize to the world. If the contribution of a factor is barely within significant figures, it makes sense to pull it out even if it's technically giving a more precise model.
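To make that concrete, here is a minimal sketch (not from this thread; the data, noise level, and polynomial degrees are invented for illustration) of how an over-parameterized fit can match the training points more closely while generalizing worse:

```python
# Toy overfitting demo: a simple model vs. one with many extra constants.
import numpy as np

rng = np.random.default_rng(0)

# Underlying "world": a straight line plus noise.
x_train = np.linspace(0, 1, 12)
y_train = 2.0 * x_train + rng.normal(scale=0.1, size=x_train.size)
x_test = np.linspace(0, 1, 100)
y_test = 2.0 * x_test  # noise-free truth, used to judge generalization

for degree in (1, 6):
    # "Adjust the constants until it fits the curve."
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")

# The higher-degree fit hugs the training points more closely but typically
# does worse on the held-out curve: more constant/variable pairs, less parsimony.
```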


Should the math underlying the modeling be less computationally intensive?
Your generalization is sort of true, but I don't think it'd make a good real-world measure. Having more factors in your equation would be one source of increased computation, but so would using division and logarithms, which may or may not indicate a more complex model.


Should the number of justifying axioms be minimized?
An axiom can represent a relationship between countless factors, so there's a preference for simplicity within axioms.




http://en.wikipedia.org/wiki/Overfitting
 
Foreigner said:
Can you define what you mean by parsimonious?

Not really, as this thread's entire point is to come up with a working definition of parsimony. ;) However, a hypothesis that explains observations simply and elegantly (possibly minimizing the number of underlying axioms adopted) is parsimonious.


Mr. Shitlord said:
Complexity is measured by the number of constant/variable pairs that make up the model. The typical procedure is: take all your data points and put them on a graph, make an assumption about the type of model the data represents (diff eq, neural network, etc.), plug in your data points as factors, then adjust the constants until the model fits the curve. The problem with too many factors is overfitting the curve, which causes a lack of generalization. The more variables you have, the more likely it is you'll find correlations that apply to your data but don't generalize to the world. If the contribution of a factor is barely within significant figures, it makes sense to pull it out even if it's technically giving a more precise model.

I think that this conception of parsimony is quite workable when we are talking about statistical modeling (and in particular work with the general linear model). However, how should we treat the body of implicit assumptions that imbue particular models with meaning and connect them to other theoretical frameworks? (As science constructs its empirical object, I believe that no hypothesis or model can really be treated 'in vacuo'; science produces theory as a corporate body.)


An axiom can represent a relationship between countless factors so there's a preference for simplicity within axioms.

Right. I'm wondering what it is to choose a parsimonious body of axioms.

eballsack?

Sorry, that's "eblowla" to you, hun. ;)

ebola
 
Sorry that I was unclear. I meant to draw attention to my opinion that simplicity of assumption needs to mean something more sophisticated than just minimizing the raw count of assumptions made.

so essentially, you would need to find a way to quantify the "degree of complexity" of a given assumption? and then the parsimoniousness of a given theory would be a linear combination of the degrees of complexity of all its constituent assumptions, i guess

this is a very interesting thread

it boggles my mind how one might go about quantifying complexity in such a way (does one measure it against a metric with a "maximum degree of complexity"? e.g. the entirety of the physical universe, and all its underlying logic (e.g. mathematical reality, what plato would have called the "world of forms", etc)), but my intuition tells me there is probably some ultra-clever way to do it that people much smarter than me (or superintelligent philosopher robots of the future) could figure out
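For what it's worth, here is a hypothetical sketch of that weighted-sum idea. The per-assumption complexity scores and weights below are pure placeholders; nothing in this discussion says how they would actually be assigned:

```python
# Hypothetical sketch: parsimony as a weighted sum of per-assumption
# complexity scores. The scoring scheme and example numbers are invented.
from typing import Dict, Optional

def theory_complexity(assumption_complexity: Dict[str, float],
                      weights: Optional[Dict[str, float]] = None) -> float:
    """Higher total = less parsimonious, under this toy convention."""
    if weights is None:
        weights = {name: 1.0 for name in assumption_complexity}
    return sum(weights[name] * score for name, score in assumption_complexity.items())

# Toy comparison using the river example from earlier in the thread.
physics_account = {"gravity": 2.0, "terrain geometry": 3.0, "fluid behaviour": 4.0}
wanting_account = {"water 'wants' to loop": 1.0, "more stuff means more wanting": 1.0}

print(theory_complexity(physics_account))  # 9.0
print(theory_complexity(wanting_account))  # 2.0: "simpler" by this naive measure,
                                           # which is exactly the worry raised earlier
```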
 
Roger&Me said:
so essentially, you would need to find a way to quantify the "degree of complexity" of a given assumption? and then the parsimoniousness of a given theory would be a linear combination of the degrees of complexity of all its constituent assumptions, i guess


If we're equating degree of complexity with parsimony, then we have a problem: the models explaining the greatest range of phenomena are going to have the least parsimony. For example, relativistic mechanics makes more assertions than Newtonian mechanics even though the former explains the latter. You'd need to normalize with a measure of explanatory power, or vice versa. The idea of parsimony is for comparing two models of the same phenomena, which this thought experiment does not do.
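One established example of that kind of normalization, offered only as an illustration and not as something anyone in this thread proposed, is an information criterion such as AIC, which penalizes parameter count while rewarding fit. The data and models below are invented:

```python
# Illustration: the Akaike information criterion trades goodness of fit
# against the number of parameters, i.e. it "normalizes" complexity by
# explanatory power.
import numpy as np

def aic_for_polynomial_fit(x: np.ndarray, y: np.ndarray, degree: int) -> float:
    """AIC = 2k - 2*ln(L), using a Gaussian likelihood for the residuals."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n = len(y)
    sigma2 = np.mean(residuals ** 2)
    log_likelihood = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = degree + 2  # polynomial coefficients plus the noise variance
    return 2 * k - 2 * log_likelihood

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)
y = 1.5 * x + rng.normal(scale=0.1, size=x.size)  # the "true" process is linear

for degree in (1, 5):
    print(f"degree {degree}: AIC = {aic_for_polynomial_fit(x, y, degree):.1f}")

# The lower-AIC model wins: extra parameters have to buy enough added fit
# to justify their complexity penalty.
```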






ebola? said:
However, how should we treat the body of implicit assumptions that imbue particular models with meaning and connect them to other theoretical frameworks? (As science constructs its empirical object, I believe that no hypothesis or model can really be treated 'in vacuo'; science produces theory as a corporate body.)


Science doesn't happen in a vacuum, but it does organize into internally consistent islands. Cross between those islands and you run into shit like the "hard problem of consciousness". Trying to pit Derrida against Einstein is meaningless.
 