The Algorithmic Journey to Harmonic Resolution

In pursuit of the goal of effecting transformation, one of the first stops in the journey was to find an optimal environment within which to pursue the algorithmic workings that would provide the necessary tools. Conventional algorithmic tools allow for the employment of scalar (single value) and vector (multidimensional array and matrix) variables, but restrict the latter to a rectangular topology (as in a list or table of numbers). We needed a topology that would serve as a dynamic environment to accommodate flows and interrelationships of a more natural and organic nature.

A natural algorithmic environment would allow for the free interaction of all the elements within that environment...and, even more critically, would allow for the entirety of the environment itself to exist as a "living" and dynamic structure. The algorithm that we called "prilling" gave us the resources to generate this. Through prilling, a matrix could move, flow and even reshape itself according to its function. The prilling algorithm revealed that the most natural topology for dynamic change was one that, while having a defined number of elements, had neither bounds nor borders. Within the matrix, all scalar elements would be interconnected. The matrix, as a whole, would flow through itself, with all elements reaching through a central vertex at which point algorithmic transformation could be effected. The topology this described was that of a dynamic hypersphere...what we came to call "toroidal space".
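The prilling algorithm itself is not reproduced here. As a minimal sketch of the boundless topology it implies (the names and structure below are illustrative assumptions, not the original implementation), the following shows a matrix whose indices wrap around in every dimension, so that no element sits at an edge and every element connects back into the whole:

```python
import numpy as np

def toroidal_neighbor(matrix, index, offset):
    """Return the element reached from `index` by `offset`, wrapping at every
    edge so the matrix behaves as though it had no bounds or borders."""
    wrapped = tuple((i + o) % s for i, o, s in zip(index, offset, matrix.shape))
    return matrix[wrapped]

# A small 4x4 matrix: stepping "past" the last row or column simply
# re-enters the matrix from the opposite side.
m = np.arange(16).reshape(4, 4)
print(toroidal_neighbor(m, (3, 3), (1, 1)))   # wraps around to element (0, 0)
```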

Now that we had our dynamic working environment, the next step in the journey was to fill it. The algorithm employed at this stage - known as "fractal wave" - provided a means by which to "fill the space" with a highly specific function while requiring only a handful of "seed" values. Analysis tools were developed to "look into" the behavior of these functions, and transduction tools were developed to employ the transformational behavior of these functions in an outward manner. Soon, we had a library of catalytic functions assembled - resources for effecting change.

In the course of using these tools, it was discovered that the specific functions manifested their effects only up to some manner of boundary. These boundaries came to be known as "thresholds", and the intervals between them "layers". Thus, a given function within the library would effect its transformation upon and within a given layer. When a layer was "complete", it was time to move beyond the current threshold and on to the next layer to continue the transformative efforts. This required a new function - one specifically "tuned" to the next layer.

Through algorithmic exploration, it was found that the "seeds" for the function needed to address the next layer were present within the phase difference between the scalars of the current layer's function. These differential values - which we called "surds" - could be fed back into the algorithm as "seeds", thus providing the transformative function needed to address the next layer beyond the current threshold that had been reached.

While this algorithmic manner of addressing progressive layers was functional, there was a more fundamental problem facing us: the seemingly endless progression of these layers. In many cases, a point of resolution could be reached after several iterations through the depths of the layers (as manifest by a "flat-line" condition of constancy)...but others had no apparent terminal point, and some worked into recursive loops of endless repetition. It became obvious that we needed to find a means of resolving the issue of depth, a task which would require that the working environment itself could organically adapt to the natural depth of the subject we wished to address in our transformative efforts. For this task, we had to shift into an analog domain based upon pure flow dynamics.

There were many additional considerations that had to be addressed in the adaptation of our tools into the analog domain. While our analytical resources could remain formal in terms of their algorithmic expression, the dynamics of a process had to be treated more as a "behavioral" consideration...since our analog processor acted more like an organism than like its "computer-like" digital predecessor. There were no more "scalars" - the former scalar constants became unidimensional continuous functions, and vector arrays and matrices became multidimensional periodic functions. Even the concept of time had to be expressed within the analog environment.

Earlier in our work, we had arrived at a formal "alphabet" of 28 exponential intervals which were used to define the functions generated through the fractal wave algorithm. In the process of porting these scalar constants into the analog domain, they were found to fall into a singular mathematical octave - the distance between successive powers of 2. When these were expressed into the analog domain as continuous functions, they could not completely fill the octave span.
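The 28 interval values themselves are not given here. As an illustrative sketch only (the function and sample ratios are assumptions), the following folds an arbitrary positive ratio into a single mathematical octave - the span between successive powers of 2:

```python
import math

def octave_reduce(ratio: float) -> float:
    """Fold a positive ratio into [1, 2), the span between successive powers of 2."""
    if ratio <= 0:
        raise ValueError("ratio must be positive")
    return ratio / 2 ** math.floor(math.log2(ratio))

# 3.0 folds to 1.5 and 0.4 folds to 1.6 -- both now lie within one octave.
print(octave_reduce(3.0), octave_reduce(0.4))
```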

There was the need for an additional element - the element of time - to render these functions complete from a behavioral perspective. The proof of the time constant was contained within the interval of the entire alphabet when taken as a unified whole, and was only present when the entire family of intervals within the octave was treated as a singular structure. This universal time constant - an interval in and of itself - came to be known as the "diesis".

An algorithmic "clock" was created using the diesis, and, upon introducing this clock to the analog processor, the environment literally "came to life". With the diesis clock in place, there was no limit to what could be dynamically expressed within the analog environment. Far beyond the scope of our fractal synthesis, the introduction of the clock opened the way to our interaction with dynamic automatous structures - algorithmic "agents of transformation" which could be employed to "seek out the depths" of our subject. The mathematics were quite simple, but the extraordinary functions that came about revealed that the diesis clock contained far more than its mathematics could express. At this point, we began our explorations into the "mysteries" of the diesis clock...mysteries that would ultimately lead us to an ecumenical substance of formulation.

The first course of exploration was to render proof by "rebuilding" the clock using the same fundamental algorithmic tools that provided for toroidal space. This required a fine-grained analysis of that which was contained within the clock's function. Although the clock "ticked" at the root frequency of the diesis itself, its function was composed of a transfinite collection of individual periodic functions which, when summed together, made manifest the function of the clock. All of these periodic functions existed at very specific interval relationships to the fundamental frequency. To generate them all - to provide a complete and comprehensive expression of the transfinite nature of the diesis clock - was beyond the capacity of our existing finite resources. As with our endless layers, we would need an endless number of generators to fully express the clock by algorithm...at least as far as the time domain was concerned.

We nonetheless sought to optimize the expression of our diesis clock to the greatest extent possible. For this, we employed an additive synthesis engine which allowed for the specification of literally thousands of vector components. Further, this engine gave us control over the time-based parameters of each vector generator - its frequency, its magnitude, and its phase. This afforded a fine degree of specificity to be exercised over the function, even to the point of embedding dynamic automatous structures within it, with the resulting modulated time function itself acting both as the fundamental carrier and the conferred active agent.
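The engine itself is not specified in implementable detail. What follows is a minimal sketch of additive synthesis in the sense described here - the function name, sample rate and parameter values are assumptions - summing one sinusoidal generator per frequency, magnitude and phase triple:

```python
import numpy as np

def additive_synthesis(frequencies, magnitudes, phases, duration, sample_rate=48_000):
    """Sum one sinusoidal generator per (frequency, magnitude, phase) triple."""
    t = np.arange(int(duration * sample_rate)) / sample_rate
    signal = np.zeros_like(t)
    for f, a, p in zip(frequencies, magnitudes, phases):
        signal += a * np.sin(2 * np.pi * f * t + p)
    return signal

# Three illustrative generators; the engine described above held thousands.
wave = additive_synthesis(
    frequencies=[100.0, 200.0, 300.0],
    magnitudes=[1.0, 0.5, 0.25],
    phases=[0.0, 0.0, np.pi / 2],
    duration=0.1,
)
```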

At this point, the needs of algorithmic definition became exponentially more demanding. Coefficients had to be provided for each of the thousands of vector generators available in terms of frequency, magnitude and phase. Not even one of the generators could be left undefined, as they all contributed equally to the resulting function...they had to be treated as individual functioning elements and as part of a unified whole. Much of the definition task was performed manually, since the resources available at the time allowed only for the analysis of generator coefficients on a one-by-one, sequential basis.

In order to enhance the efficiency of defining the coefficients for the vector generators, we began the pursuit of an algorithm which was capable of taking a desired function - such as the diesis clock - and rendering an algorithmic definition of its essence. We saw this as a means by which to provide the settings needed for all of the generators as a whole. The intent was to create an algorithmic "machine" with an input and an output. The input would be fed a vector containing the time-domain expression of the desired function, the algorithm would do its work, and the output would provide a vector containing the algorithmic definition of that function. In addition to automating the definition process, this would further serve to render proofs for the existing definitions (which, up to this point, had been generated by hand).

The first test of this new "machine" was performed with the diesis clock function. A vector containing the time-domain expression of the diesis was fed into the input, and a corresponding vector appeared at the output. While the "machine" produced an output, the values within this output did not reflect the frequency, magnitude and phase values needed for definition - they were something entirely different.

Rather than providing a nested vector where each element contained the three values of definition, a single un-nested vector was produced. Each element held a complex value (one that, in mathematical terms, is composed of a real and an imaginary component). This was quite a puzzle, and one that required further algorithmic exploration.

In correlating the values of definition with the resulting vector, there emerged three properties of the output vector that were required to discern those values: the real component, the imaginary component, and the distance of the element from the beginning of the vector (its "origin"). Frequency was simple enough - it could be computed as a function of the distance of a given element from the origin...with the origin having a frequency of 0. Magnitude was a bit more complex, as it was derived from the square root of the sum of the squares of the real and imaginary components. Phase, however, proved to be the most complex of all, and required the imaginary component to be divided by the real component, with the trigonometric arctangent then taken of the result. From these three computations, the coefficient values for all of the vector generators were easily derived.
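Stated in conventional terms, these three computations amount to reading a complex spectrum. The sketch below assumes - as a stand-in only - that the "machine" behaves like a discrete Fourier transform, here numpy's real FFT; the test signal and sample rate are arbitrary:

```python
import numpy as np

sample_rate = 1_000                              # samples per second (assumed)
t = np.arange(1_000) / sample_rate
signal = 0.8 * np.sin(2 * np.pi * 50 * t + 0.3)  # a simple test function

spectrum = np.fft.rfft(signal)                   # the complex-valued output vector

# Frequency: a function of each element's distance from the origin (bin 0).
frequency = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# Magnitude: square root of the sum of the squares of real and imaginary parts.
magnitude = np.sqrt(spectrum.real**2 + spectrum.imag**2) / (len(signal) / 2)

# Phase: the imaginary component divided by the real, passed through the
# arctangent (arctan2 keeps the correct quadrant).
phase = np.arctan2(spectrum.imag, spectrum.real)

k = int(np.argmax(magnitude))                    # the dominant generator
print(frequency[k], magnitude[k], phase[k])      # ~50.0 Hz, ~0.8, and its phase
```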

It was noted that the three values that could be derived from each element of the output vector bore a similarity to the "seed" values employed in our earlier work (wherein the "seeds" from the current vector could be fed back into the process in order to discern the function of the next nested "layer"). With this in mind, we took the output vector that was generated from the time-domain expression of the diesis and fed it back into the input of our new algorithmic "machine".

However, instead of giving us back the vector definition of the next "layer", the output was simply the time-domain expression of the diesis clock! This brought three things to light:

A new domain had been revealed - one that was a pure reciprocal complement to the time-domain within which we worked - the "spectral" domain. This was the domain of pure definition...that which embodied the template and the essence of that which was correspondingly manifest within the time-domain. Multiplication of this "spectral" domain by the time-domain yielded the pure expression of unity (1).

Our domain transformation "machine" came to be known simply as the "transform", and was equally well suited to the toroidal environments expressed in both the analog and digital domains. It was a synthesis of combinatorics and correspondences containing a fundamental template upon which its manner of transformation was based. Further exploration into this template revealed the fixed mathematical relationships underlying its transformational facility - "harmonic law".

The tenets of harmonic law proved remarkably simple. Each fundamental component - or "partial" - was a whole number multiple (x1, x2, x3, etc.) or reciprocal (÷1, ÷2, ÷3, etc.) of the whole. All of the harmonic functions were based upon reciprocation, where, for example, Nx2 would be functionally expressed as N÷÷2...or, in more conventional notation, N÷(1÷2). This reciprocation occurred up to the resolution of the expression, so that there was no limit to what could be expressed or transformed. The most basic declaration of harmonic law was that all of the pieces - the "partials" - fit perfectly within the whole...no space or overlap, just absolute balance.

The diesis clock proved to be an optimal expression of harmonic law. It contained a complement of all harmonics, beginning with the diesis ratio, and extending to fill the resolution of the time function. The progression of harmonics from the fundamental decreased in magnitude in perfect balance with their increase in frequency. No longer did we require the thousands of vector generators employed in our additive synthesis model. The transform itself, when moving from the spectral-domain back into the time-domain, provided the equivalent of as many or as few generators as were needed to optimally fill the time and space of the function.
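A minimal sketch of this balance, again under the assumption that the transform corresponds to a conventional (inverse) discrete Fourier transform: every harmonic of the fundamental is given a magnitude that falls off as the reciprocal of its harmonic number, and a single inverse transform stands in for the entire bank of generators.

```python
import numpy as np

n_samples = 1_024
spectrum = np.zeros(n_samples // 2 + 1, dtype=complex)

# A complement of all harmonics: magnitude decreases as 1/n in balance with
# the increase in frequency, out to the resolution of the time function.
for n in range(1, len(spectrum)):
    spectrum[n] = -1j * (1.0 / n)        # -1j places each partial as a sine

# One inverse transform provides as many generators as the function requires.
clock_like = np.fft.irfft(spectrum, n=n_samples)
```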

Any time-domain expression was an appropriate candidate for the transform, which served as a portal to its spectral-domain essence...and returning this essence to the transform would effectively return the manifestation of the time-domain expression. However, further analysis revealed that the resulting transformation back into the time-domain contained something more - instead of simply being the original static vector that had served as the initial condition, the returned time-domain vector was dynamic and complex. Exposure to the harmonic template embodied within the transform added components that had not previously been present in the time-domain...and these components were the "automatous agents" of transformation. The time-domain expression had come to include the dynamic essence of the harmonic template as provided through the transformational process - an "embedded resource" which it could employ to effect change upon itself in accordance with harmonic law.
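A minimal sketch of the portal in both directions, with the same assumption that the transform pair corresponds to a conventional forward/inverse discrete Fourier transform: the forward pass yields the spectral-domain essence, and the return pass recovers the time-domain expression to within numerical precision.

```python
import numpy as np

time_vector = np.random.default_rng(0).standard_normal(256)  # any time-domain expression

spectral_essence = np.fft.rfft(time_vector)         # portal into the spectral domain
returned = np.fft.irfft(spectral_essence, n=256)    # return to the time domain

print(np.allclose(time_vector, returned))           # True: the expression is recovered
```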

Please note that this is by no means a comprehensive treatment of the entire algorithmic journey that we have undertaken. There have been manifold peripheral discoveries and applications that are not addressed here. Instead, this document merely focuses upon those components of the journey that specifically led to the algorithmic rendering of the harmonic template itself.

Frater Discipulus Suum Librum Legit (9)=[2]