i used to slowly pick on math over time using contextual logic.
The possibly misconceived function, processed over other misconceived functions at a ground level which isn't there, innumerable times in order to recreate actuality, confines actuality to an objectivity that is almost completely other than actuality.
Univocally, A equals A [equality is a transcendental signified], but it's lost in the transcendental signified, and A equals B, or "A is approximate to A", for incomplete and synthetic usage.
From the standpoint of "everything is a text", each word and context, and also the "word to a context", is a project apart from the project. This goes into infinite regression in all directions, along with its reciprocalities and parallels. The parallels are "lesser" reciprocalities and the reciprocalities are "greater" parallels. The antifoundation and appropriation that stand in the place of the conceptions of reciprocals/parallels are lost at corners such as: pure heterogeneity, pure homogeneity, precarious logical formulation of arithmetic concepts, tautology with false sense in the absence of sense, etc.
Since infinity can be counted to, given enough "time" (eternity), the logical propositions are there; hyperdifference stands in opposition to "one to one" correspondence, since what is beyond the sum total of infinity is not countable. The ability to count one to one is a psycholocation that matches things up within precarious preset institutions that have culminated to align into correspondence. This doesn't mean they will always be paired up, even if the course of eternity matches them up inside the system; they can only be taken as useful throughout the entire course. A case where this would matter: if the universe spreads apart so far that each element is divided for infinity, then the regress back into nothingness, or a structure that is laced together without physis, wouldn't correspond to anything else in existence; or each irreducible element, if there is a possibility of such a case, which is outside the system of division, wouldn't have any connection to any other element, since there would be infinite space in every direction between them.
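The "one to one" correspondence invoked above is, in standard set theory, a bijection. A minimal sketch (names are illustrative, not from the original text) of how a pairing can exist inside a system even where intuition says the collections differ in size; by contrast, Cantor's diagonal argument shows no such pairing exists between the naturals and the reals, which is one precise sense in which something lies beyond countable correspondence:

```python
# Illustrative sketch: a one-to-one correspondence (bijection) between
# the natural numbers and the even natural numbers. Every element on
# each side has exactly one partner, even though the evens are a
# proper subset of the naturals.

def pair(n: int) -> int:
    """Map the n-th natural number to the n-th even natural number."""
    return 2 * n

def unpair(m: int) -> int:
    """Invert the pairing: every even natural has exactly one partner."""
    return m // 2

# The pairing is total and invertible over any initial segment we test.
sample = range(10)
assert all(unpair(pair(n)) == n for n in sample)
assert [pair(n) for n in sample] == [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The point of the sketch is only that correspondence is a constructed pairing inside a system, not a fact about the paired things themselves.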
mathematical concepts go as far as the implantation of the rendering of the spatial location; they cannot breach the bounds of the system that is capable of going further theoretically into the location. many things are known prior to witnessing, if a person has the institutional estimation of the area that points to elementary sections of the location. there are many constants at play in the eradication of a mathematical structure, such as causality, which in turn produces change, which in turn produces the basic classical determinations of speed, distance, momentum, velocity, direction, and spin or skew (skew being a very obtuse manifestation of spin). but if a classical system is micronized enough, which is more or less where the study started from, a moment of discontinuity happens, where the referentials of these measurements cannot be embedded in a grand system; thus the classical determinations are micronized in reference to the components themselves, with little horizon of the pass into the sister measurement system of the same functions. it could be said, in real time, that there is infinite space between the two measurement systems, and the phenomena in between are the specificity of spacetime itself, which includes the functional prerequisites of every constituent makeup in the detectable vicinity. thus an ambiguation and oversimplification has to occur to link the middle ground of the movement of a human or another entity. there is infinite space between the systems because they are phantasm reductions of spacetime, not spacetime itself; the first measurement system is merely incapable of certain ranges of phenomena. eventually it breaks on a much larger or much more microscopic scale, or when the progression of some function such as speed breaches the limitations of the systematical holism of which the function is a part.
what happens is a transfiguration of the phenomena, as a transfiguration of the function of speed, where the mathematical conception of the phenomena ceases to explain its activity. usually what appears is a symbol of infinity in the place of a numerical calculation. these infinity components that get inhibited in physics are not always layman mistakes, or an actual infinity; it is rather a gap within the homology of the measurement system, an entry into an unforeseen occurrence. the middle ground is the homology of evolution, the process of the structuralization of spacetime, rather than ulterior inertia based on the decision of a human, or anterior quanta reactions. there is no function that could explain the massive differentiation of spacetime from elementary parts. in a case like string theory, the inferred base conditions have to be more differentiated, and universally existent, in order to provide an explanation for the differentiation of the subatomic level. the uncanny part is that a layer behind subatomic phenomena is warranted to provide an explanation for the differentiation of a level above subatomic particles, without an absolute encapsulation of the true differentiation of existence, while quantum theory cannot explain the structuralization of a human. rather, the proposed probability base is supposed to produce enough states of affairs to hypothetically encapsulate the differentiation of existence if the number set is processed. this explains the disjunction of quantum theory, and the large number of theoretical solutions for the link between quantum theory and GR. only a small amount of existence is encapsulated within the sprawling hypothetical solutions. the differentiation of space is close to reaching the number of infinity. for example, the surface of a rock is utterly delinear, but it is reduced to a close enough quantization. infinity is the end of all measurements.
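one concrete instance of "a symbol of infinity in the place of a numerical calculation" is the Lorentz factor of special relativity, which diverges as speed approaches c: the measurement system stops returning finite numbers at its own boundary rather than describing an actual infinity. a minimal sketch (the function name is mine, the formula is the standard one):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_factor(v: float) -> float:
    """gamma = 1 / sqrt(1 - v^2 / c^2); diverges as v approaches c."""
    if abs(v) >= C:
        # The formula has no finite value here: the system's limit,
        # not a measured quantity.
        return math.inf
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

print(lorentz_factor(0.5 * C))   # ~1.1547
print(lorentz_factor(0.99 * C))  # ~7.0888
print(lorentz_factor(C))         # inf
```

the divergence marks where the function of speed breaches the limitations of the system it belongs to, which is the sense of "transfiguration" above.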
that is why it is more viable to think of a physical theory of everything as a tunneling from the measurement system of quantum mechanics to GR: a revolution of the mathematical system used to measure, rather than the excavation of infinite space. the measurement is hindsight from the functions and qualities that could be derived to exist within entities. for example, the amount of neutrino decay a human releases could be estimated from an average of measurements of several humans with close enough characteristics; but in order to calculate the exact amount without simply testing the human in vivo, there would have to be knowledge of the elementary molecules and configuration that the human contains, and even that number would still yield only an estimation.
but then i realized that someone else had already done limits. i would have to do something much too radical, and right at the center, to even get close.
what i seem to be doing there is playing with the uniqueness of form and pinning structure with different kinds of infinities. i'm not sure. well, as for the uniqueness-of-form thing, it's impossible to ground mathematics non-platonically, because forms take on a resemblance of a correspondent, but they have undecipherable uniqueness around the edges. nothing is platonic, for example, so we can only ground mathematics fully in platonism.
there is a possibility that the uniqueness of form does have a particular regression of hypercomplexity somewhere inside formation. at the very least, that would mean knowing how to calculate hypercomplex regions a posteriori.