All matter, energy, events, and other manifestations of our perceivable reality are formed of dynamic, systolic loops - conventionally referred to as "strings" - which vibrate within a hyperdimensional manifold. The less measurable physical properties, such as gravitational waves and the weak nuclear force, appear faint precisely because they expend a significant portion of their energy into the higher dimensions. Indeed, what classical science regards as quanta and subatomic particles are merely the "tracings" of these loops within a four-dimensional subset of that manifold.
While this concept continues to gain acceptance in the theoretical physics community - unifying the conflicting fields of general relativity and quantum mechanics under the banner of "string" and "M-brane" theory - it still fails to identify what governs and defines the dynamics of these trans-dimensional loops. This parametric anti-space is what we've come to refer to as the "algorithmic domain", whose all-encompassing nature is partially depicted in the diagram accompanying "The Formulation of Making". An algorithm is a step-by-step problem-solving procedure: an established, well-defined computational set of instructions for solving a problem in a finite number of steps. Informal algorithms are like verbal directions to someone's house - not exact, and somewhat open to interpretation. Formal algorithms are absolutely specific, leaving no room for interpretation or change.
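The contrast between informal and formal algorithms can be made concrete with a classic example - Euclid's algorithm for the greatest common divisor. This is a sketch for illustration only; it is not drawn from the essay itself, but it shows what "absolutely specific" means in practice: every step is exact, and termination in a finite number of steps is guaranteed.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a formal, fully specified procedure.

    Each step is unambiguous, and the remainder strictly decreases,
    so the procedure halts in a finite number of steps.
    """
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # -> 6
```

Unlike verbal directions to someone's house, nothing here is open to interpretation: two executors given the same inputs must produce the same result.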
This mirror-image reciprocal domain is populated with the smallest parts of material reality: individual "partials" that together make up the larger manifold self - indivisible units of reality, each of which represents the unified holographic essence of the whole - autonomous agents of change that define the "state space" (manner of being) and "rule space" (manner of doing) of the manifestation in question. These partials are collectively referred to as Automata.
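In conventional computing terms, an automaton with an explicit state space and rule space can be sketched as an elementary cellular automaton. The example below (a hypothetical illustration, not a model of the domain itself) uses Wolfram's Rule 110: the "state space" is the set of possible cell configurations, and the "rule space" is the lookup table mapping each local neighborhood to the next state.

```python
# Rule space: each three-cell neighborhood maps to the cell's next state.
RULE_110 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(cells):
    """Advance the entire state one tick under the rule space (wrapping edges)."""
    n = len(cells)
    return [RULE_110[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

# State space: one configuration, a single live cell among 31.
state = [0] * 15 + [1] + [0] * 15
for _ in range(5):
    state = step(state)
```

Each cell acts as an autonomous agent of change: its next "manner of being" is fully determined by its "manner of doing", the shared rule table.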
Think of the generative loops of manifestation (strings) as flowing "vector fields" which freely permeate the membrane of the algorithmic domain; as they pass through, change is effected upon them in accordance with the rule-space of that domain - an adaptive / stochastic construct which defines or modulates the behavior of the loops according to their manner. If we want to find the seat of natural law, this is the place to look, as it embodies the generative essence of all reality. Likewise, if we want to effect change on something, this is the place to do it from, as this is the point at which the strings interact with the rules.
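As a purely computational analogy (the names and the toy field below are hypothetical, not part of the essay), a vector field whose elements are modulated by a stochastic rule as they pass through a boundary can be sketched like this:

```python
import random

def rule(vx, vy, strength=0.1):
    """Stochastic rule space: perturb each vector by a small random amount."""
    return (vx + random.uniform(-strength, strength),
            vy + random.uniform(-strength, strength))

# State space: a 4x4 grid of 2-D vectors, a toy "vector field".
field = [[(1.0, 0.0) for _ in range(4)] for _ in range(4)]

# One pass through the "membrane": every vector is transformed by the rule.
field = [[rule(vx, vy) for (vx, vy) in row] for row in field]
```

The field's overall character is preserved while each element is adjusted "according to its manner" - here, by a bounded random modulation.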
However, the true beauty and utility of this domain lies beyond its procreant and governing capabilities. By establishing our own algorithmic structures and endowing them with appropriate temporal & spectral "control handles", we are able to modulate the behavior of the operators within this domain to our benefit (provided, of course, that our own creations remain harmonious with the natural laws already resident therein). We need only introduce these new components at the point of transition / transformation - what I've previously called the "bridge" - to inject them into the algorithmic manifold.
While we can easily determine the effectiveness of this infusion by comparing the objectives of our algorithm with the manifest results, reverse-engineering the process is not so straightforward. The technology to "read" existing automata - even those structures of our own creation that we introduce into the environment - is not currently available. Thus, we're left to rely upon empirical evidence of how our algorithms function, using it as feedback for assessing and modifying their rule-space. This, combined with algorithmic modeling in a suitable environment, is how we conceive, create & test the feasibility of a given construct.
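The empirical cycle described above - compare manifest results against the objective, then adjust the rule-space accordingly - is structurally a closed feedback loop. A minimal sketch under stated assumptions (the `process` function is a hypothetical stand-in for dynamics we cannot "read" directly; only its output is observable):

```python
def process(gain, factor=2.0):
    """Toy stand-in for the manifest result of a given rule-space setting."""
    return gain * factor  # hidden dynamics, observable only through output

def tune(target=10.0, steps=20, rate=0.1):
    """Closed-loop feedback: adjust the rule parameter from observed error."""
    gain = 0.0
    for _ in range(steps):
        result = process(gain)     # observe the manifest result
        error = target - result    # compare against the objective
        gain += rate * error       # modify the rule-space accordingly
    return gain

gain = tune()  # converges toward the setting whose result matches the target
```

No internal inspection of `process` is needed: the parameter is steered entirely by the discrepancy between objective and outcome, which mirrors the essay's reliance on empirical evidence as the sole feedback channel.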