## Comments

I did quite a bit of writing on this theme years ago, with a strong bibliography. But I could never push the thesis past a certain point, and "the great theorem" that I suspect(ed) is somehow inherent here remained a quixotic fantasy. In the last couple of weeks I started moving these ideas into a new word-processing format and dumping stacks of Wikipedia articles into my document, because so many articles there look like "parts" or facets of some larger integral design -- one that I imagine embodies a stunning elegance and simplicity. For me, concept theory, the hierarchical layering of computer languages, and the hierarchical layering of computer hardware all seem to embody the same "linear perfection". It's all simple enough to explain in the opening lecture of a CompSci 101 class for freshmen at any community college. The sophistication comes in because preserving that simplicity across all these levels opens up some amazing possibilities -- possibilities that are obscured whenever combinations fail to preserve absolute linearity, which leads to the common popular perception that all this stuff is really complicated and controversial. For a broad review, see the Wikipedia article on upper ontology: https://en.wikipedia.org/wiki/Upper_ontology


Here's a simple big-picture image of "the hierarchy of abstraction" understood as a brain model. Can all these elements be understood as parts of a single structure, as I am inclined to suppose? All of this does appear reasonably consistent with real-world brain science. Can it all be constructed algebraically, in terms of "one primitive"? A big idea -- but an interesting one, if workable....

<img src="http://origin.org/graphics/bridge.jpg">

And here's the same idea defined strictly as the polar ends ("lowest" and "highest") of a hierarchical spectrum of abstraction -- shown here as the tension between "parts" and "wholes", or between "reductionism" and "holism". Does "all logic" really run across this single spectrum? If it does, why isn't this more obvious to everybody? If it doesn't, what is wrong with this model?

<img src="http://origin.org/graphics/integralholon.png">

This discussion was posted in the "Technical" section. But that section is for discussions having to do with technical aspects of running the Azimuth Wiki and Forum -- problems with the software, problems with spam, et cetera. I have moved this discussion somewhere more appropriate: "General".


Thanks for putting this discussion in the right place.

Just a tiny question or thought: has anyone explored the algebraic idea of a "cut on a cut" as per "Dedekind cut"?

A Cartesian coordinate frame is usually defined by X and Y axes in the real numbers, and a Dedekind cut might be defined somewhere on the X axis, perhaps at the square root of 2. What if the origin were defined by the X axis being a Dedekind cut in Y, and the Y axis being a Dedekind cut in X? Is this completely obvious or trivial -- or wrong-headed? It seems possible to explore a bunch of interesting things this way, by treating the lowest decimal place as defining the unit interval, shown in this graphic as the range from 1 to 2.

<img src="http://networknation.net/graphics/DedekindCut3.PNG">

PS -- this is my current exploratory superflash regarding "struggles with the continuum" -- maybe a cool way to map the finite to the infinite (?)
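For what it's worth, the standard construction behind this question can be sketched in a few lines of code. A Dedekind cut is fully determined by a membership predicate on the rationals, so the cut at the square root of 2 never needs to compute an irrational number. The `point_from_cuts` helper below is purely my speculative reading of the "cut on a cut" idea (a point in the plane given by one cut per axis); it is not an established construction, just a minimal sketch.

```python
from fractions import Fraction

def below_sqrt2(q: Fraction) -> bool:
    """Lower-set membership test for the Dedekind cut at sqrt(2).

    The cut is fully determined by this predicate: q is in the lower set
    iff q is negative or q^2 < 2. No irrational is ever computed.
    """
    return q < 0 or q * q < 2

# The cut partitions the rationals into lower and upper sets:
assert below_sqrt2(Fraction(7, 5))        # 1.4 lies below sqrt(2)
assert not below_sqrt2(Fraction(3, 2))    # 1.5 lies above sqrt(2)

def point_from_cuts(cut_x, cut_y):
    """Hypothetical 'cut on a cut': a point in the plane specified by one
    Dedekind cut per axis, so the point is a pair of predicates rather
    than a pair of computed real numbers."""
    return (cut_x, cut_y)

origin = point_from_cuts(below_sqrt2, below_sqrt2)
# Query the point: does the rational pair (1.4, 1.5) lie below-left of it?
print(origin[0](Fraction(7, 5)), origin[1](Fraction(3, 2)))  # prints: True False
```

Using `Fraction` keeps every comparison exact, so the predicate is decidable for any rational input, which is the whole appeal of Dedekind's definition.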

Sorry, I just have to bite on this -- I looked at the "struggles" blog, saw that it is currently active, and found this highlighted quote:

"Not as a continuous, infinitely divisible quantity, but as a discrete quantity composed of an integral number of finite equal parts" -- https://www.physicsforums.com/insights/struggles-continuum-part-4/

This goes to exactly the point of my "cut on a cut" exploration. That approach opens up a recursive cascade of decimal places, where each level of decimal place functions as a bounded interval with a lowest and a highest value, and a range of uncertainty extending between them. All of these decimal-place intervals are "finitely equal", and each can be understood as a recursively nested, stand-alone unit interval defined by a boundary-value range -- with "the continuum" defined as the unknowable uncertainty "one level below" our capacity to finitely measure...

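The "recursive cascade of decimal places" described above can be made concrete: each decimal place selects one of ten equal sub-intervals of the previous level, giving a nested sequence of exact rational intervals that pin down sqrt(2) without ever computing it. This is a minimal illustrative sketch, not anything from the original post; the function name and the choice of sqrt(2) are mine.

```python
from fractions import Fraction

def decimal_interval(n: int) -> tuple[Fraction, Fraction]:
    """Return the width-10**-n decimal interval containing sqrt(2).

    Level 0 is the unit interval [1, 2]. At each deeper level, split the
    current interval into ten equal parts ("finite equal parts") and keep
    the one whose endpoints straddle sqrt(2), i.e. lo^2 < 2 <= hi^2.
    """
    lo, hi = Fraction(1), Fraction(2)
    for level in range(1, n + 1):
        step = Fraction(1, 10 ** level)
        for k in range(10):
            a = lo + k * step
            b = a + step
            if a * a < 2 <= b * b:   # sqrt(2) lies in [a, b]
                lo, hi = a, b
                break
    return lo, hi

for n in range(4):
    lo, hi = decimal_interval(n)
    print(n, float(lo), float(hi))
# 0 1.0 2.0
# 1 1.4 1.5
# 2 1.41 1.42
# 3 1.414 1.415
```

Each level is a bounded interval with a lowest and highest value, the "range of uncertainty" is exactly the interval's width 10^-n, and the intervals nest forever without ever reaching the irrational point itself, which matches the intuition of the continuum living "one level below" any finite measurement.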