A CLEVresearch Initiative

Project Echo

A utility-led architecture for quantum-resilient asset identity and risk surface control.

EVOLUTIONARY DISSIPATIVE CIRCUIT DESIGN (EDCD)

Non-Unitary Control via Topologically Bounded Chiral Ansätze

Gwendalynn Lim Wan Ting and Gemini

Nanyang Technological University, Singapore

December 6, 2025

ABSTRACT

Quantum control in Noisy Intermediate-Scale Quantum (NISQ) regimes is currently limited by the unitary gate model's inability to efficiently navigate high-dimensional Hilbert spaces under decoherence. We introduce Evolutionary Dissipative Circuit Design (EDCD), a non-Markovian control protocol that utilizes environmental noise as a topological resource rather than an error source. Central to this architecture is "Solitaire Logic" (Patience Sorting), a resource management strategy that treats the Hilbert space not as a chaotic ocean, but as a deck of cards to be sorted. By grouping entangled states into ordered "Logical Stacks" and creating "Free Cells" for reuse, we demonstrate that the search space can be compressed from an exponential explosion to a manageable, sub-linear manifold. We further propose a Chiral Chimera Ansatz that exploits time-reversal symmetry breaking to induce a "Resonant Horn" topology, effectively shaving off stochastic noise via dissipative projection. Finally, we define a Turing State Selection mechanism at the classical-quantum interface, acting as a computability filter. This architecture enables robust state preparation by enforcing density-dependent normalization boundaries, turning the noise floor into a control surface.

I. INTRODUCTION: The Paradigm Shift

The realization of fault-tolerant quantum computing is hindered by the stochastic nature of environmental coupling. Standard Quantum Error Correction (QEC) seeks to isolate the system, treating decoherence as a destructive force. We propose an inversion of this paradigm: Dissipative State Engineering, where the environment is engineered to drive the system toward a target attractor state.
We postulate that the decoherence rate $\gamma$ induces a local metric density on the Hilbert space, creating a topology analogous to a porous medium ("Sponge Topology"). Within this metric, we apply a Min-Max Normalization bounded by the noise floor $\mathcal{B}(\gamma_{noise})$:
$\hat{y} \ge \mathcal{B}(\gamma_{noise})$
By strictly bounding the prediction $\hat{y}$ by the environmental noise floor, we prevent the system from exploring invalid, high-entropy subspaces.
1. Traditional vs. Dissipative Control Paradigms
This image replaces the "fighting decoherence" metaphor with a rigorous comparison of Unitary Dynamics (closed system, Lindblad terms) versus Dissipative State Engineering (open system, attractor manifolds, dark states).
2. The Sponge Topology
A 3D visualization of the Hilbert space as a porous medium, where the decoherence rate dictates the pathways for quantum state evolution.

II. THE MECHANICS OF DISSIPATIVE ENGINEERING: Filter, Sponge, and Activation

Conventional quantum control views the Hilbert space as a uniform, flat terrain where any deviation from unity is an error. EDCD fundamentally rejects this view. We recognize that decoherence induces a complex, non-uniform topology—a "bumpy" terrain where some regions are naturally protected while others are chaotic noise. To navigate this, we employ a three-stage physical mechanism that transforms environmental noise into a control resource.
A. The Sponge: Dissipative Normalization via Metric Density
We postulate that the decoherence rate $\gamma$ is not constant but state-dependent, creating a local metric density on the Hilbert space. This forms what we term a "Sponge Topology."
  • High-Noise Regions (High Density): Areas with rapid decoherence are "dense" with environmental coupling. Like squeezing a dense sponge, the available state space here is compressed tightly. The system is physically restricted from exploring these high-entropy areas.
  • Low-Noise Regions (Pores): Areas with symmetry protection (e.g., decoherence-free subspaces) are the "pores" of the sponge—the "accepted solution space."
We apply Dissipative Normalization relative to this density. Instead of normalizing against a theoretical ideal, we normalize against the local noise pressure. This naturally compresses the vast Hilbert space, forcing the quantum state trajectory into the low-density "pores."
B. The Filter: Turing State Selection
Once the "Sponge" has defined the shape of the valid space, we need a mechanism to accept or reject the resulting states at the classical-quantum interface. We define a "Noise Floor Boundary" ($\mathcal{B}$) derived from the sponge topology. The classical control system applies a Turing Selection test to the output vector $v$:
  • If $v \le \mathcal{B}$ (Below Noise Floor): The state is indistinguishable from environmental noise. Mathematically, it represents a non-halting or non-computable condition. The filter prunes this state.
  • If $v > \mathcal{B}$ (Above Noise Floor): The state has sufficient coherent signal to be computationally useful. It is accepted as a valid "Turing State."
Only these accepted Turing States are passed on to become the "piles/stacks" managed by the resource architecture.
C. The Activation: Hardware-Level ReLU
This filtering is not merely software post-processing; it is an inherent property of the noisy hardware. We establish a functional equivalence between environmental decoherence and the classical Rectified Linear Unit (ReLU) activation function. In a noisy regime, quantum hardware physically cannot sustain a coherent signal below a certain threshold. The environment effectively "zeroes out" sub-threshold signals. Therefore, the "Activation" of the quantum neural network is hardware-gated. The network only "wakes up" and processes data when the quantum state survives the sponge topology and crosses the threshold defined by the filter.
The Quantum-Classical Pipeline
A visualization of the data flow from noisy quantum state to filtered classical signal.

III. THE ROSETTA STONE: Architecture & Implementation Matrix

*To facilitate cross-disciplinary reproduction, we define the core mechanics via Logic, Physics, and Python implementation.*
TABLE 1: The EDCD Rosetta Stone

1. The Filter (Turing Selection)
The Logic: We check if the quantum signal is strong enough to be "real." If it is below the noise floor, it is pruned from the primary calculation.
The Physics:
$\hat{y} = \begin{cases} W \cdot v & \text{if } v > \mathcal{B} \\ \text{handle\_pruned}(v) & \text{if } v \le \mathcal{B} \end{cases}$
The Code:
```python
def turing_filter(vector, noise_floor):
    if vector.signal > noise_floor:
        return w_transform(vector)
    else:
        return handle_pruned_state(vector)  # e.g., for Echo-Key
```

2. The Sponge (Normalization)
The Logic: We squash the data into a valid shape based on the density of the noise holes. High noise density = tighter squeeze.
The Physics:
$y_{norm} = \frac{y - \min(y)}{\max(y) - \min(y)} \cdot (1 - \gamma)$
The Code:
```python
def sponge_normalize(data, decay_rate):
    scale_factor = 1.0 - decay_rate
    return min_max_scale(data) * scale_factor
```

3. The Activation (Hardware ReLU)
The Logic: The neural network only "wakes up" if the quantum state survives the chiral spiral.
The Physics:
$Activation = \text{ReLU}(W \cdot \Psi_{chiral})$
The Code:
```python
class QuantumLayer(nn.Module):
    def forward(self, x):
        x = self.chiral_evolve(x)
        return F.relu(self.w_matrix(x))
```
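To show how the three rows compose, here is a self-contained toy pipeline in the same spirit; the NumPy stand-ins below, and the specific values of $\gamma$ and $\mathcal{B}$, are our illustrative assumptions, not a reference implementation.

```python
import numpy as np

def sponge_normalize(data, decay_rate):
    # The Sponge: min-max normalize, then compress by the noise density gamma.
    scaled = (data - data.min()) / (data.max() - data.min())
    return scaled * (1.0 - decay_rate)

def turing_filter(signal, noise_floor):
    # The Filter: keep only components above the noise floor boundary B.
    return signal[signal > noise_floor]

def activation(w_matrix, state):
    # The Activation: hardware-gated ReLU on the W-transform projection.
    return np.maximum(0.0, w_matrix @ state)

raw = np.random.default_rng(0).random(8)           # toy 8-component measurement
normed = sponge_normalize(raw, decay_rate=0.3)     # gamma = 0.3 (assumed)
accepted = turing_filter(normed, noise_floor=0.2)  # B = 0.2 (assumed)
W = np.eye(accepted.size)                          # placeholder W-transform
output = activation(W, accepted)
```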

IV. RESOURCE ARCHITECTURE: The Solitaire Protocol

*By using the Sponge to compress the space and the Filter/Activation to define valid states, we transform an intractable exponential problem into a manageable one. We are no longer dealing with infinite chaos, but with discrete, valid units of entangled information. This allows us to introduce our resource management layer:*

To manage the computational cost of this density manipulation, we introduce a resource management architecture based on Patience Sorting (Solitaire Logic). In a standard quantum circuit, the controller attempts to track every qubit individually, leading to exponential complexity ($\mathcal{O}(2^N)$). EDCD applies three specific Solitaire rules to compress this:
1. The Stack Principle (Compression): We group highly correlated (entangled) qubits into "Logical Stacks." The control system treats this stack as a single object. This collapses the dimensionality of the control problem—we are no longer moving 50 individual qubits; we are moving 7 "Stacks."
2. The Free Cell Heuristic (Reuse): We utilize Mid-Circuit Reset and Reuse. Once a "Stack" has served its purpose in a sub-routine, we measure it, reset it, and reuse those physical qubits. This allows us to run algorithms that theoretically require 1,000 qubits on a machine that physically only has 50.
3. The Look-Ahead Constraint (Pruning): We implement a Constraint Filter. The classical system analyzes potential circuit evolutions and "prunes" (discards) any branch that leads to a high-entropy "Dead End." We do not waste coherence time exploring paths that do not help "clear the board." (A minimal sketch of all three rules follows below.)
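The following is a minimal classical-side sketch of these three rules. It assumes stacks have already been identified by some correlation test; LogicalStack, SolitaireController, and the entropy estimate are illustrative names, not components of an existing library.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class LogicalStack:
    # Rule 1 (The Stack Principle): a group of highly entangled qubits
    # tracked as one control object, collapsing the dimensionality.
    qubit_ids: List[int]
    done: bool = False

class SolitaireController:
    def __init__(self, stacks: List[LogicalStack]):
        self.stacks = stacks
        self.free_cells: List[int] = []  # Rule 2: qubits available for reuse

    def retire(self, stack: LogicalStack) -> None:
        # Rule 2 (The Free Cell Heuristic): after mid-circuit measurement
        # and reset, the physical qubits return to the free-cell pool.
        stack.done = True
        self.free_cells.extend(stack.qubit_ids)

    def prune(self, branches: list, entropy_of: Callable, max_entropy: float) -> list:
        # Rule 3 (The Look-Ahead Constraint): drop high-entropy dead ends
        # before any coherence time is spent exploring them.
        return [b for b in branches if entropy_of(b) < max_entropy]

# Usage: 50 physical qubits managed as 7 stacks instead of 50 tracks.
stacks = [LogicalStack(list(range(i * 7, i * 7 + 7))) for i in range(7)]
ctrl = SolitaireController(stacks)
ctrl.retire(stacks[0])  # free 7 qubits for the next sub-routine
```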
Topological Quantum Compression
A diagram showing the three core components of Topological Quantum Compression: Subspace Partitioning (The Stack Principle), the Qubit Reuse Protocol (Mid-Circuit Measurement), and Trajectory Pruning (Branch & Bound Filter).
The Solitaire Protocol
A visualization of the Solitaire Logic, treating quantum states as a deck of cards to be sorted into manageable "Logical Stacks."

V. THE ANSATZ: Chiral Dissipative Chimera & Feynman Path Integrals

The mathematical partitioning of the Hilbert space into "Logical Stacks" requires a physical mechanism to create and maintain these stable, entangled subspaces amidst environmental noise. We achieve this using a Chiral Dissipative Chimera Ansatz. This ansatz is not a static variational form but a dynamic trajectory that evolves under a Lindblad Master Equation with broken $\mathcal{PT}$-symmetry. Its physical interpretation is best understood through the lens of Feynman's Sum-Over-Histories.
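For concreteness, the dynamics referenced above take the standard GKSL (Lindblad) form; how the broken $\mathcal{PT}$-symmetry enters the jump operators is our illustrative reading, not a derived result:

$\frac{d\rho}{dt} = -\frac{i}{\hbar}[H, \rho] + \sum_k \gamma_k \left( L_k \rho L_k^\dagger - \frac{1}{2}\{ L_k^\dagger L_k, \rho \} \right)$

Here $H$ generates the coherent part of the evolution and the $L_k$ are the jump operators through which the bath acts. A chiral ansatz would bias these operators toward one handedness, so that clockwise and counter-clockwise histories see different effective damping rates $\gamma_k$.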
A. The Feynman Interpretation: Wiggly Lines as Quantum Histories
As visualized in Figure 4, the quantum state's evolution is not a single, clean path. According to Feynman's path integral formulation, the system simultaneously explores every possible trajectory between its initial and final states. In our noisy "Sponge Topology," these paths are not smooth curves but "wiggly," stochastic trajectories perturbed by environmental interactions.

Each "wiggle" represents a scattering event with a bath phonon or photon. In a standard system, these random phases would destructively interfere, leading to decoherence. However, our architecture is designed to exploit this.
B. Chirality for Error Immunity
We force the state trajectory into a Chiral (spiraling) path around the noise floor defined by the Turing filter. This "handedness" imposes a topological constraint on the sum over histories. Symmetric error paths, which would otherwise contribute to decoherence, are "twisted" out of phase and destructively interfere with themselves. Only the paths that conform to the chiral symmetry constructively interfere, creating a robust, protected "stack" of probability current.
C. Palindromic Compression via Resonant Horn Geometry
To further compress this sum of histories, the ansatz evolution creates a Tractrix (Resonant Horn) Geometry, acting as a dissipative "Mode Cleaner."

Figure 4: Visualizations of the core geometric and topological concepts of the Chiral Dissipative Chimera ansatz. The "Feynman Path Integral" shows the concept of sum-over-histories. The "Resonant Horn" acts as a mode cleaner.

Feynman Path Integral
A visualization of the Feynman path integral formulation, showing a particle exploring all possible paths (wiggly lines) between two points.

The Resonant Horn (Conceptual Analogy)
This is a conceptual illustration, not a depiction of actual matter. It serves as an analogy for how the Chiral Chimera Ansatz acts as a "Mode Cleaner," preserving the core signal while dissipating asymmetric error modes.

VI. THE BRIDGE: Turing State Selection

A. The $W$-Transform as a Differentiable Proxy
A critical innovation of this work is the realization that the trained classical control network—specifically the $W$-Transform—acts as a differentiable proxy for the complex, noisy quantum Hilbert space. During training, the network learns the high-dimensional boundary of the "accepted solution space" defined by the Sponge Topology. Because this classical proxy is differentiable, we can invert its operation. We can start with a desired classical target output, $\hat{y}_{target}$, and use standard gradient-based optimization techniques (like backpropagation) to find the input quantum vector, $v_{ideal}$, that maximizes the network's activation for that target.
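A minimal sketch of this inversion, assuming the trained proxy behaves like an ordinary differentiable PyTorch module; the linear w_transform and all dimensions below are stand-ins, not the trained network.

```python
import torch
import torch.nn as nn

w_transform = nn.Linear(16, 4)  # stand-in for the trained classical proxy
for p in w_transform.parameters():
    p.requires_grad_(False)     # freeze the proxy; only the input is optimized

y_target = torch.tensor([1.0, 0.0, 0.0, 0.0])  # desired classical output

v = torch.randn(16, requires_grad=True)        # candidate input vector
opt = torch.optim.Adam([v], lr=0.05)

for _ in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(w_transform(v), y_target)
    loss.backward()
    opt.step()

v_ideal = v.detach()  # input that best reproduces the target through the proxy
```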
B. The Logic: Turing State Selection
The system filters experimental results via a process we term Turing State Selection. This is the operational implementation of the logic defined in our Rosetta Stone (Table 1):
  • *Input:* The raw quantum vector $v$, measured after evolution through the Chiral Ansatz.
  • *Test:* The classical system attempts to map this vector using the $W$-Transform: can this vector be mapped? This operation is equivalent to checking whether the state satisfies the bounded normalization condition $v > \mathcal{B}$, where $\mathcal{B}$ is the noise floor boundary.
  • *Action:*
    • If Yes: The vector is a "Turing State"—a computable, usable signal that exists within the accepted solution space. It is passed to the Solitaire protocol for resource management.
    • If No: The vector is indistinguishable from environmental noise. It represents a non-computable or "non-halting" state. The filter discards it, preventing high-entropy noise from propagating into the classical control logic.
This mechanism ensures that the classical system only expends computational resources on states that have successfully navigated the dissipative topology, effectively turning the noise floor into a hard computational boundary.
The Turing State Filter
The Metaphor: The "Golden Cookie" Key
Only a key with the exact, correct pattern (a computable state) can unlock the classical interface.


VII. THE CLASSICAL ADVANTAGE: Efficiency via Physical Regularization

A critical advantage of EDCD is the significant reduction in classical processing overhead, a direct consequence of shifting the burden of error management from software correction to physical regularization.
A. From Correction to Selection
Traditional Quantum Error Correction (QEC) requires massive classical resources to continuously measure syndromes and calculate corrections in real-time. EDCD eliminates this overhead. The classical system stops "fixing" individual errors (expensive) and starts "selecting" valid topological states (cheap). This is possible because the classical control network has already learned the "shape" of the valid solution space during training.
B. Physical Regularization as Pre-formatting
Environmental decoherence acts as a physical regularizer. By restricting the quantum state trajectory to the low-density pores of the "Sponge Topology," the environment effectively pre-formats the quantum data into a lower-dimensional manifold. The quantum system naturally explores the "path of least resistance" defined by the noise floor.
C. "Tracing the Groove"
Because the classical network's $W$-transform is a differentiable map of this pre-formatted space, the prediction $\hat{y}$ is reduced to a simple inference task. The classical computer does not need to simulate or correct the complex quantum dynamics; it simply "traces the groove" that has already been carved by the environment. The noise does the difficult formatting work physically, and the classical computer cheaply reaps the reward.

VIII. CROSS-DISCIPLINARY IMPLEMENTATION: The Quantum-Neural Interface

A key innovation of EDCD is the direct mapping of its physical architecture onto standard classical neural network components. This is not merely an analogy; it is a functional equivalence that allows the quantum system to be trained and optimized as a differentiable layer within a larger classical machine learning pipeline.
1. The Quantum Circuit as a Physical Embedding Layer
In classical deep learning, an embedding layer maps high-dimensional, discrete inputs into a lower-dimensional, continuous vector space, capturing semantic relationships. In EDCD, the quantum circuit performs this exact function, but physically.
  • *Input:* Classical Data ($x$).
  • *Mechanism:* The data is encoded into an initial quantum state. This state then evolves through the dissipative "Sponge Topology" governed by the Chiral Ansatz. The environment acts as a physical feature extractor, naturally filtering out high-entropy components and preserving topologically robust features.
  • *Output:* A "noise-vector" ($v_{raw}$) that encodes the topological features of the data as they have been shaped by the environment. This is a physically embedded representation of the classical input (a toy simulation follows below).
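A toy simulation of this embedding step, assuming amplitude encoding and a simple global dephasing channel as the "environment"; both are standard textbook models, and the specific rate is arbitrary.

```python
import numpy as np

def embed(x, gamma=0.3):
    """Toy physical embedding: amplitude-encode classical data, then let a
    dephasing environment damp the off-diagonal (fragile) coherences."""
    amp = x / np.linalg.norm(x)             # amplitude encoding of input x
    rho = np.outer(amp, amp.conj())         # pure-state density matrix
    diag = np.diag(np.diag(rho))
    rho_noisy = diag + np.exp(-gamma) * (rho - diag)  # bath damps coherences
    # The "noise-vector" v_raw: the features that survived the environment.
    return rho_noisy.flatten().real

v_raw = embed(np.array([0.2, 0.9, 0.1, 0.4]))
```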
2. Decoherence as a Hardware-Level ReLU Activation
We establish a direct functional equivalence between Environmental Decoherence and the classical Rectified Linear Unit (ReLU) activation function, $f(x) = \max(0, x)$. The noise floor boundary $\mathcal{B}$ acts as the activation threshold (a classical emulation follows after this list).
  • Coherent Signal ($Signal > \mathcal{B}$): When the quantum state's coherent signal is strong enough to survive the environmental noise, it exists above the noise floor. The classical $W$-transform can successfully map this state, projecting a non-zero output vector. This is physically equivalent to a ReLU neuron "firing" (outputting a positive value).
  • Decohered Noise ($Signal \le \mathcal{B}$): When the state is eroded by decoherence, its signal falls below the noise floor. The state is effectively "zeroed out" by the environment. The $W$-transform projection collapses to a null or discard state. This is physically equivalent to a "dead" ReLU neuron (outputting zero).
  • Result: The neural network is physically prevented from activating on noise. The environment itself implements a Hardware-Gated Activation Function, ensuring that only coherent, meaningful signals are propagated through the network.
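Classically emulated, this gating reduces to a ReLU whose threshold is the noise floor; the sketch below treats $\mathcal{B}$ as a plain numeric parameter with an assumed value.

```python
import numpy as np

def hardware_gated_relu(signal, noise_floor):
    # Components at or below B are "zeroed out" by the environment;
    # only coherent signal above the floor propagates (the neuron "fires").
    return np.where(signal > noise_floor, signal, 0.0)

print(hardware_gated_relu(np.array([0.05, 0.4, 0.15, 0.8]), noise_floor=0.2))
# -> [0.   0.4  0.   0.8]
```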
3. The Classical Network as a Differentiable Proxy
The classical control network (specifically the $W$-transform) serves as a differentiable proxy for the quantum system. This is the crucial link that enables inverse design (Section VI). We can backpropagate error signals from the classical output through the $W$-transform to optimize the input quantum parameters, effectively "training" the quantum circuit using classical gradient descent.
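A compact sketch of that gradient path, under the assumption that the noisy quantum evolution is exposed to the optimizer through a differentiable surrogate; surrogate_circuit and every dimension below are illustrative.

```python
import torch
import torch.nn as nn

theta = torch.randn(16, requires_grad=True)  # trainable quantum parameters
w_transform = nn.Linear(16, 4)               # the classical proxy (W)
opt = torch.optim.SGD([theta], lr=0.1)

def surrogate_circuit(x, theta):
    # Differentiable stand-in for the noisy quantum evolution.
    return torch.tanh(x * theta)

x = torch.randn(16)
y_true = torch.tensor([0.0, 1.0, 0.0, 0.0])
for _ in range(200):
    opt.zero_grad()
    y_hat = w_transform(surrogate_circuit(x, theta))
    loss = torch.nn.functional.mse_loss(y_hat, y_true)
    loss.backward()   # error flows back through W into theta
    opt.step()        # classical gradient descent on the quantum parameters
```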
Hardware-Gated ReLU
The Metaphor: A Switch That Only Flips for a Strong Signal
The system is physically prevented from activating on noise; only a strong, coherent signal can "flip the switch".


IX. PROJECT ECHO-KEY: THE ROSETTA STONE IN ACTION

To demonstrate the practical power of the EDCD architecture, we present Project Echo-Key, a quantum identity authentication protocol. This application directly leverages the Turing State Selection and dissipative normalization mechanisms defined in our core "Rosetta Stone" to create a secure, evolving, and device-specific security key.
The architecture has a fundamental duality. The forward path uses physics to filter data, but the reverse path uses that same filtering logic to gain deep insights. This is the core principle behind Project Echo-Key. When the Turing Filter "prunes" data that falls below the noise floor, it's not just discarding random junk. It's discarding a very specific *pattern* of junk that is unique to that device's physical imperfections. By collecting and analyzing this "pruned" data—the "Negative Space"—we can reverse-engineer the hardware's unique identity.
The following table maps the general EDCD principles to the specific operational steps of the Echo-Key protocol.

TABLE 2: The Echo-Key Implementation Matrix

1. Enrollment (Fingerprinting)
The Logic: We measure the device's unique noise pattern. States that fall *below* the noise floor define its "fingerprint."
The Physics:
$Fingerprint = \{ v_i \mid v_i \le \mathcal{B}_{device} \}$
*(Collect the discarded states)*
The Code:
```python
def enroll_device(quantum_data):
    noise_floor = measure_baseline()
    fingerprint = [v for v in quantum_data if v.signal <= noise_floor]
    return fingerprint
```

2. Evolution (The Living Key)
The Logic: The "Master Key" is not static. It is a quantum state that evolves over time along a predictable chiral path.
The Physics:
$\ket{Key(t)} = e^{\mathcal{L}_{chiral} t} \ket{Key(0)}$
*(Time-evolution via the Lindblad equation)*
The Code:
```python
def evolve_key(initial_key, time_t):
    lindblad_op = get_chiral_operator()
    evolved_key = quantum_evolve(initial_key, lindblad_op, time_t)
    return evolved_key
```

3. Authentication (The Challenge)
The Logic: To log in, the device's *current* noise pattern must match the *predicted* state of the evolving key.
The Physics:
$Auth = \begin{cases} \text{Grant} & \text{if } \text{Sim}(F_{now}, \ket{Key(t)}) > \mathcal{T}_{thresh} \\ \text{Deny} & \text{if } \text{Sim}(F_{now}, \ket{Key(t)}) \le \mathcal{T}_{thresh} \end{cases}$
The Code:
```python
def authenticate(current_fingerprint, evolved_key):
    similarity = calculate_similarity(current_fingerprint, evolved_key)
    if similarity > AUTH_THRESHOLD:
        return "Access Granted"
    else:
        return "Access Denied"
```
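Composing the three rows, one full authentication round might look like the following self-contained toy; the exponential-decay stand-in for the chiral trajectory, the cosine similarity, and the numeric thresholds are all our assumptions for illustration.

```python
import numpy as np

NOISE_FLOOR, AUTH_THRESHOLD = 0.2, 0.95
rng = np.random.default_rng(7)

def enroll(device_noise):
    # Fingerprint = the sub-threshold "negative space" of the device.
    return device_noise[device_noise <= NOISE_FLOOR]

def evolve_key(key, t, rate=0.05):
    # Toy stand-in for the predictable chiral Lindblad trajectory.
    return key * np.exp(-rate * t)

def authenticate(fingerprint_now, predicted_key):
    sim = fingerprint_now @ predicted_key / (
        np.linalg.norm(fingerprint_now) * np.linalg.norm(predicted_key) + 1e-12)
    return "Access Granted" if sim > AUTH_THRESHOLD else "Access Denied"

key_0 = enroll(rng.random(16) * 0.4)  # enrollment-time fingerprint
# At time t = 3 both sides predict the evolved key; the device re-measures.
print(authenticate(evolve_key(key_0, 3), evolve_key(key_0, 3)))
# -> "Access Granted" (the device's evolved pattern matches the prediction)
```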

X. CONCLUSION & OUTLOOK: Toward Algorithmic Proofs for Dissipative Quantum Security

The Evolutionary Dissipative Circuit Design (EDCD) architecture establishes a new paradigm where environmental decoherence is transformed from a stochastic error source into a topological resource for control and computation. By introducing dissipative normalization and Turing State Selection, we have demonstrated a mechanism to define and navigate a bounded "sponge topology" within the Hilbert space, enabling efficient resource management and novel ansatz design.
Beyond its immediate application to control, this framework has profound implications for the foundations of quantum security. Current Quantum Key Distribution (QKD) protocols rely on the fundamental laws of physics to guarantee security. However, the practical implementation of these protocols on NISQ devices is vulnerable to device-specific imperfections and side-channel attacks.
We propose that the methodologies established herein—specifically the ability to rigorously index, classify, and pattern-match unique decoherence profiles using our "Turing Filter" and "$W$-Transform" mapping—provide the necessary theoretical tools to construct algorithmic proofs for device-dependent quantum security.
By treating a device's unique "decoherence mold" as an unforgeable physically unclonable function (PUF), EDCD allows for the derivation of security proofs that are intrinsically tied to the specific hardware implementation. This shifts the security basis from ideal theoretical assumptions to verifiable physical reality. The ability to reverse-engineer information from both the pruned noise manifold and the compressed solution space suggests a pathway to rigorous, mathematically provable authentication protocols, foundational to the next generation of quantum-secure infrastructure.

Addenda