# Short Intro: Interference Networks

**Assumptions**

When describing strongly *inhomogeneous* wave spaces with many wave source points and/or destinations lying anywhere in an *n*-dimensional wave space (optics) or in a colored wave space (nerve nets, high-frequency circuits), classical wave equations based on linear network theory reach their limits. They are mostly not "cleanly" applicable (recall the problems of acoustic beamforming methods applied to non-uniform microphone arrays).

Linear network theory describes equilibrium processes within a network that can be solved with linear differential equations. If such processes do not occur at single points with infinite speed, they take on wave character.

To follow the path of a wave, we use a discretization method based on "wave-conducting wire lines". We call such nets "interference networks" (IN).

**Structure**

Any *n*-dimensional IN consists of branches and nodes. Each node has an *n*-dimensional spherical coordinate. Time functions propagate (slowly) through the net. Each branch has a delay, or a set of delays for different time-function types, and a weight or a set of weights.

Any time function flowing through a branch is delayed by the branch delay and partially damped by a factor or function. Nodes can combine the time functions of different branches (AND, OR, XOR, add, multiply, integrate, fuzzy...).

If branches are supposed to have a velocity, the length-to-width ratios of branches (nerves) correspond to delays. Consequently, a three-dimensional IN structure creates a corresponding processing structure (advanced filter theory, space-time filters, computers).

In contrast to the common "electric node abstraction" - used for electric nets, control theory, state machines, and artificial neural networks - *it is forbidden* to propagate any signal *infinitely fast*. Any signal flow has finite velocity. Hence, to bridge even the smallest distance we need a (delaying) branch.

On the other side, if a signal bridges a distance, it needs time. (The definition implies that clocks, if used, also have delays.) Consequently, it is forbidden to move the locations of nodes "artificially" (a difference to artificial neural networks, ANN) - the interference structure would be destroyed. Thus an *interference code* (or nerve code) is defined by the delay structure between nodes (-> delay mask, delay vector) and the corresponding weight and operator structure (synaptic coefficients in nerves, type of conjunction).
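The branch/node rules above can be sketched as a small discrete-time simulation. This is only a minimal illustration under assumptions of our own: the names (`InterferenceNet`, `add_branch`, `run`) and the choice of an additive node operator are made up for the example. The one rule taken directly from the definition is that every branch must delay by at least one time step, so no signal propagates infinitely fast:

```python
from collections import defaultdict

class InterferenceNet:
    """Minimal IN sketch: nodes with coordinates, delaying/weighting branches."""

    def __init__(self):
        self.branches = []   # (src, dst, delay, weight)
        self.coords = {}     # node name -> spatial coordinate

    def add_node(self, name, coord):
        self.coords[name] = coord

    def add_branch(self, src, dst, delay, weight):
        # No infinitely fast signal flow: every branch delays by >= 1 step.
        assert delay >= 1, "every branch must delay"
        self.branches.append((src, dst, delay, weight))

    def run(self, inputs, steps):
        """inputs: node -> {time step: value}; nodes combine by addition."""
        out = defaultdict(lambda: [0.0] * steps)
        for node, series in inputs.items():
            for t, v in series.items():
                if t < steps:
                    out[node][t] += v
        # Delays >= 1 make the net causal: contributions only move forward.
        for t in range(steps):
            for src, dst, delay, weight in self.branches:
                if t + delay < steps:
                    out[dst][t + delay] += weight * out[src][t]
        return dict(out)

# A pulse from A reaches C over two paths with equal total delay (3 = 1 + 2):
net = InterferenceNet()
for name, xyz in [('A', (0.0,)), ('B', (0.5,)), ('C', (1.0,))]:
    net.add_node(name, xyz)
net.add_branch('A', 'C', delay=3, weight=1.0)
net.add_branch('A', 'B', delay=1, weight=1.0)
net.add_branch('B', 'C', delay=2, weight=1.0)
out = net.run({'A': {0: 1.0}}, steps=5)
# out['C'][3] -> 2.0: matching path delays give constructive self-interference
```

Because the two path delays coincide, the same time function meets itself at node C - the simplest case of self-interference; moving node B (and thereby its delays) would destroy the coincidence.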

**Properties**

Between the channel number *k* and the dimension *n* there exists an optimum match condition: *k = n+1*. The actual relation between channel number and physical dimension defines over-conditioning restrictions (axial near-sharpness in optics; no limitation for time-inverted reconstruction). We use time or delay inversion as a technical solution to these over-conditioning restrictions.
Data addressing in an IN uses specific forms of coherence between signals flowing over different branches. We distinguish between self-interference and cross-interference. Coherence carries images or correlation maps.
Given a set of channel data (time functions at different locations), the interest can focus on the *reconstruction* of a generating excitation map or on the *projection* into a detecting field.
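Reconstruction by delay inversion can be sketched as follows, also illustrating the match condition *k = n+1* with three channels in a two-dimensional field. All concrete values here - wave speed, channel positions, grid spacing, and the function names - are assumptions made up for this example, not part of the IN definition:

```python
import math

# k = 3 channels observe a pulse emitted somewhere in a 2D field (n = 2,
# so k = n+1). For each candidate field point we shift every channel back
# by its propagation delay; only at the true source location do all
# channels interfere constructively.

C = 1.0                                        # assumed wave speed
MICS = [(0.0, 0.0), (4.0, 0.0), (2.0, 4.0)]    # assumed channel locations
SRC = (1.0, 1.0)                               # true source (simulation only)

def delay(a, b):
    """Propagation delay between two field points."""
    return math.dist(a, b) / C

def simulate(t_max, dt=0.1):
    """Each channel records a unit pulse delayed by its distance to SRC."""
    n = round(t_max / dt)
    channels = []
    for m in MICS:
        rec = [0.0] * n
        rec[round(delay(SRC, m) / dt)] = 1.0
        channels.append(rec)
    return channels, dt

def reconstruct(channels, dt, grid):
    """Delay inversion: undo each channel's path delay and sum."""
    best, best_val = None, -1.0
    for p in grid:
        val = 0.0
        for m, rec in zip(MICS, channels):
            i = round(delay(p, m) / dt)
            if i < len(rec):
                val += rec[i]
        if val > best_val:
            best, best_val = p, val
    return best, best_val

channels, dt = simulate(t_max=10.0)
grid = [(x * 0.5, y * 0.5) for x in range(9) for y in range(9)]
loc, peak = reconstruct(channels, dt, grid)
# loc -> (1.0, 1.0): the reconstruction peaks at the true source
```

Only at the true source point do all three back-shifted channels add constructively; elsewhere the delay-inverted contributions miss each other. With fewer than *n+1* channels the peak would smear into a curve of candidate locations.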

**History**

Historically, interference nets were born in 1993 as "neural networks" physically enhanced by spatial coordinates and a time axis. Such nets generate various new properties (especially mirrored maps) purely through the enhanced model precision.
The first application of an interference network was the invention of (passive) acoustic photo- and cinematography (Acoustic Camera), using negative delays and/or time inversion, in the years 1994 to 1996.

We regard interference networks as the missing link and mathematical abstraction between electric network theory, control theory, wave optics (lens projection), wave acoustics (beamforming), sonography, electrical tomography (Radar, MRT...), antenna theory, (space-time) filter theory, digital signal processing, and nerve network theory.