
# The linear recursion of "distinction", "dimension", and "ordered class"

edited January 2016 in General

I have once again begun studying the fundamentals of semantic ontology and category/classification theory, and am again considering the once-seemingly explosive thesis that the entire semantic structure of classification, and all cognitive processes, can be built from ONE extremely simple algebraic primitive -- that element being the concept of "distinction", probably understood as a "cut", as in "Dedekind cut".

Years ago, after a great deal of work on these ideas, I became persuaded that the entire structure of taxonomy, for example, can be mapped in terms of dimensions, and that every dimension in that structure is composed of "distinctions". This led to the wild and intriguing doctrine that the entire structure of cognition takes the form of "a cut on a cut on a cut on a cut" (i.e., descending levels of specificity in a cascade, like a taxonomy). I expect to push this hard for a little while, so I may put my deep confusions or accelerating enthusiasm into this framework. Thanks John, thanks all.
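
As a toy illustration of what "a cut on a cut" might mean computationally, here is a minimal Python sketch in which a concept is nothing but the sequence of binary distinctions needed to reach it. The sample questions and names are purely my own illustrative assumptions, not anything from the literature:

```python
# A toy sketch of "a cut on a cut": each level of a taxonomy is one
# binary distinction, so a concept is just the sequence of cuts needed
# to reach it. The sample questions are purely illustrative.

distinctions = ["living?", "animal?", "vertebrate?", "warm-blooded?"]

def classify(answers):
    """Map a sequence of yes/no cuts to a path down the taxonomy."""
    path = []
    for question, answer in zip(distinctions, answers):
        path.append((question, "yes" if answer else "no"))
        if not answer:       # a "no" cut closes off this branch
            break
    return path

# four successive cuts pick out one increasingly specific class
print(classify([True, True, True, True]))
```

Each successive distinction narrows the class, so the taxonomy is literally a cascade of cuts, each one made inside the class selected by the cut above it.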

1.
edited January 2016

I did quite a bit of writing on this theme years ago, with a strong bibliography, etc. But I could never push the thesis past a certain point, and "the great theorem" that I suspect(ed) is somehow inherent here remained a quixotic fantasy.

In the last couple of weeks, I started moving these ideas into a new word-processing format, and began collecting stacks of Wikipedia articles in my document, because so many articles there look like "parts" or facets of some larger integral design -- a design that I imagine embodies a stunning elegance and simplicity. For me, concept theory, the hierarchical layering of computer languages, and the hierarchical layering of computer hardware all seem to embody the same "linear perfection". It's all so simple that it could be explained in the opening lecture of a CompSci 101 class for freshmen at any community college. This is not very sophisticated stuff. Where it becomes sophisticated is that preserving this simplicity across all these levels opens up some amazing possibilities -- possibilities that are obscured when combinations do not preserve absolute linearity, leading to the common popular perception that all this is really complicated and controversial.

For a broad review, see the Wikipedia article on upper ontology: https://en.wikipedia.org/wiki/Upper_ontology

2.
edited January 2016

Here's a simple big-picture image of "the hierarchy of abstraction" understood as a brain model. Can all these elements be understood as parts of a single structure, as I am inclined to suppose? All of this appears reasonably consistent with real-world brain science. Can it all be constructed algebraically, in terms of "one primitive"? A big idea -- but an interesting one, if workable....

<img src="http://origin.org/graphics/bridge.jpg">

3.
edited January 2016

And here's the same idea defined strictly as the polar ends ("lowest" and "highest") of a hierarchical spectrum of abstraction -- shown here as the tension between "parts" and "wholes", or between "reductionism" and "holism". Does "all logic" really run across this single spectrum? If it does, why isn't this more obvious to everybody? If it doesn't, what is wrong with the model?

<img src="http://origin.org/graphics/integralholon.png">

4.
edited January 2016

This discussion was posted in the "Technical" section. But that section is for discussions having to do with the technical aspects of running the Azimuth Wiki and Forum -- problems with the software, problems with spam, et cetera. I have moved this discussion somewhere more appropriate: "General".

5.
edited January 2016

Thanks for putting this discussion in the right place.

Just a tiny question or thought: has anyone explored the algebraic idea of a "cut on a cut" as per "Dedekind cut"?

A Cartesian coordinate frame is usually defined by X and Y axes over the real numbers, and a Dedekind cut might be defined somewhere on the X axis, say at the square root of 2. What if the origin were defined by the X axis being a Dedekind cut in Y and the Y axis being a Dedekind cut in X? Is this completely obvious or trivial -- or wrong-headed? It seems possible to explore a number of interesting things this way, by seeing the lowest decimal place as defining the unit interval, shown in this graphic as the range from 1 to 2.

<img src="http://networknation.net/graphics/DedekindCut3.PNG">

PS -- this is my current exploratory superflash regarding "struggles with the continuum" -- maybe a cool way to map the finite to the infinite (?)
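
The "cut on a cut" idea can be made concrete in a short Python sketch that brackets the square root of 2 by refining decimal intervals, where each new decimal place is itself a cut inside the interval chosen one level above. The function name and the use of exact rationals are my own choices for the illustration:

```python
# A sketch of locating a Dedekind-style cut by successive decimal cuts:
# each decimal place is a cut inside the previous interval.
from fractions import Fraction

def decimal_cut(square, levels):
    """Return (low, high) bounds on sqrt(square), one decimal per level."""
    low, high = Fraction(1), Fraction(2)   # the unit interval from 1 to 2
    for n in range(1, levels + 1):
        step = Fraction(1, 10 ** n)
        # slide the lower bound up one decimal step at a time
        # while its square stays at or below the target
        while (low + step) ** 2 <= square:
            low += step
        high = low + step                  # the uncertainty is one step wide
    return low, high

lo, hi = decimal_cut(2, 4)
print(float(lo), float(hi))   # 1.4142 and 1.4143 bracket sqrt(2)
```

At every level the bounds are "finite equal parts" of the level above, and the cut itself lives in the ever-shrinking uncertainty between them.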

6.

Sorry, I just have to bite on this -- I just looked at the "struggles" blog, saw that it is currently active, and found this highlighted quote:

**"Not as a continuous, infinitely divisible quantity, but as a discrete quantity composed of an integral number of finite equal parts"**

https://www.physicsforums.com/insights/struggles-continuum-part-4/

This goes to exactly the point of my "cut on a cut" exploration. That approach opens up a recursive cascade of decimal places, where each level of decimal place functions as a bounded interval with a lowest and a highest value, and a range of uncertainty extending between them. All of these decimal-place intervals are "finite equal parts", and can be understood as recursively nested stand-alone unit intervals, each defined as a boundary-value range -- with "the continuum" defined as that unknowable uncertainty "one level below" our capacity to finitely measure...
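
The recursive cascade described above can be sketched in a few lines of Python: each decimal digit picks one of ten equal sub-intervals of the level above, so every level is a stand-alone interval whose width is the uncertainty remaining at that level. The digits of pi are just my illustrative choice of sample data:

```python
# Sketch of the recursive cascade of decimal places: each digit selects
# one of ten "finite equal parts" of the interval one level above.
digits = [1, 4, 1, 5, 9]          # decimal digits of 3.14159 after the "3."

def cascade(whole, digits):
    """List the nested [low, high) interval at each decimal level."""
    low, width = float(whole), 1.0
    levels = []
    for d in digits:
        width /= 10.0              # the uncertainty range shrinks tenfold
        low += d * width           # the cut chosen inside the level above
        levels.append((round(low, 10), round(low + width, 10)))
    return levels

for lo, hi in cascade(3, digits):
    print(f"[{lo}, {hi})  width {hi - lo:.5f}")
```

Each printed interval nests inside the one before it, and the width column is exactly the "unknowable uncertainty one level below" whatever level of measurement we stop at.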
