# Computational Landscape Architecture

## Surfacing Physical Intelligence for Design Intervention in Post-Fire Debris Flow Flooding

**GSD Seminar Proposal | Spring 2026**

**Instructors:**

- Craig Douglas, Assistant Professor of Landscape Architecture, Harvard GSD
- Stephen Guerin, Senior Teaching Associate, Harvard GSD & Visualization Research and Teaching Laboratory, Earth and Planetary Sciences

---

# The Stream Table Laboratory

## Course Description

This seminar leverages and extends GSD's Stream Table as a physical computational laboratory where landscape processes execute in accelerated time. Students observe, measure, and ultimately intervene in a system that computes its own form—developing fluency in working *with* landscape computation rather than imposing design ignorant of it.

The stream table makes visible what field conditions typically hide: the bidirectional transaction between flow (potential field) and substrate (constraint field), the emergence of channel networks, the phase transitions between erosion regimes, the path-dependence of form. Students witness the landscape computing least-action configurations in real time.

For post-fire applications, the stream table allows rapid prototyping of intervention strategies: What happens when you remove vegetation (burn the constraint field)? Where do debris flows channelize? Which barrier placements redirect flow, and which amplify scour? Students test hypotheses in minutes that would take years to observe in the field.

## Learning Objectives

By semester end, students will be able to:

1. Observe and record landscape computation using timelapse, photogrammetry, and particle tracking
2. Identify phase transition indicators (variance, correlation length, memory) in physical systems
3. Derive order parameters from observation and couple them to control parameters
4. Use AI/diffusion models to estimate extensive form from intensive gradients
5. Diagram trans-actional systems showing dual-field coupling
6. Design interventions that "script" computational outcomes by editing fields rather than specifying forms
7. Build portable observation systems using commodity hardware (phone + projector)

## Curriculum

| Week | Theme | Methods | Deliverable |
|------|-------|---------|-------------|
| 1-2 | **Observation** | Timelapse capture, baseline runs, visual pattern recognition | Observation log: what does the table compute? |
| 3-4 | **Measurement** | Photogrammetry, particle image velocimetry, flow visualization | Velocity and elevation field maps |
| 5-6 | **Order Parameters** | Derive metrics: channel sinuosity, braiding index, sediment flux, variance | Time series of order parameters across runs |
| 7-8 | **Control Parameters** | Systematic variation: slope, discharge, sediment feed, substrate | Parameter-order coupling diagrams |
| 9-10 | **AI Interpretation** | Train models to predict form from gradients, detect phase transitions | Diffusion model: intensive → extensive |
| 11-12 | **Intervention Design** | Propose field edits (barriers, substrate, flow regime) to script outcomes | Intervention hypothesis + test protocol |
| 13-14 | **Scripting** | Execute interventions, compare predicted vs computed form | Final documentation: the table as collaborator |

## Relationship to Existing LA Curricula

Traditional landscape architecture education emphasizes:

| Traditional Approach | This Seminar |
|---------------------|--------------|
| Static site analysis | Dynamic process observation |
| Form-making as authorship | Design as parameter tuning |
| Representation as rendering | Representation as execution log |
| Implementation as construction | Implementation as runtime intervention |
| Success = matches the drawing | Success = system computes desired regime |

This seminar complements studio sequences by providing a controlled laboratory where students can *fail fast*—testing hypotheses about landscape behavior in minutes rather than decades. It extends technology curricula (GIS, computational design) by grounding digital tools in physical intuition about how landscapes actually compute.

The course bridges to:

- **Hydrology/ecology coursework**: provides physical intuition for process models
- **Representation seminars**: timelapse and AI as new modes of landscape drawing
- **Theory seminars**: embodied engagement with complexity, emergence, self-organization
- **Studio projects**: transferable framework for reading sites as computational substrates

---

# AI Recovery of Computational Substrate

The stream table is a laboratory for developing AI methods that infer the computational substrate from observation. The learned methods transfer to field observation of actual landscapes—including post-fire watersheds where rapid assessment is critical.

## From Trajectories to Potential Fields

Particle tracking gives velocity fields. Velocity is the gradient of a potential (approximately, for shallow flow). Integrate observed velocities backward to recover the effective potential surface the water "thinks" it's descending. Compare to measured topography—the difference reveals where substrate friction, channel geometry, and flow depth modify the naive gravitational potential. You're recovering the *effective action functional* the system is minimizing.

## From Form Evolution to Constraint Field Dynamics

Photogrammetry gives elevation sequences. The difference between successive frames is erosion/deposition rate—a proxy for shear stress exceeding threshold. Train a model to predict Δz from local observables (slope, curvature, flow depth, upstream contributing area). The learned function *is* the constraint field update rule: how does the substrate respond to the potential field's forcing?
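
As a minimal sketch of that step, the update rule can be estimated directly from two successive elevation grids and a co-registered flow-depth map. Everything here is illustrative: the array names, the synthetic stand-in data, and the choice of a random-forest regressor are assumptions, not the seminar's prescribed toolkit.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def local_observables(z, flow_depth, cell_size=1.0):
    """Per-cell predictors: slope magnitude, curvature (Laplacian), flow depth.
    Upstream contributing area is omitted here for brevity."""
    dzdy, dzdx = np.gradient(z, cell_size)
    slope = np.hypot(dzdx, dzdy)
    curvature = (np.gradient(dzdx, cell_size, axis=1)
                 + np.gradient(dzdy, cell_size, axis=0))
    return np.column_stack([slope.ravel(), curvature.ravel(), flow_depth.ravel()])

# Stand-ins for two photogrammetry frames and a flow-depth estimate.
rng = np.random.default_rng(0)
z_t = np.linspace(1.0, 0.0, 64)[:, None] + 0.02 * rng.normal(size=(64, 64))
z_t1 = z_t + 0.01 * rng.normal(size=(64, 64))
depth_t = np.abs(0.05 * rng.normal(size=(64, 64)))

# Target: the observed change in the constraint field between frames.
X = local_observables(z_t, depth_t)
y = (z_t1 - z_t).ravel()

# The fitted function is an empirical stand-in for the update rule:
# given local forcing, how much does the substrate erode or aggrade?
update_rule = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
dz_pred = update_rule.predict(X).reshape(z_t.shape)
```

On real runs, held-out frames give the check that matters: does the learned rule reproduce where the table actually cut and deposited?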

## From Ensemble Statistics to Phase Structure

Run the table many times with identical initial conditions. Observe divergence. Where trajectories cluster, the system is in an attractor basin. Where they scatter, you're near a bifurcation. Map variance across the table surface and through time—high-variance regions are where the system is "deciding," near criticality. The AI learns to identify decision points without being told the physics.

## Diffusion Models as Score Function Learners

A diffusion model learns ∇ log p(x)—the gradient of the log probability of configurations. Train on observed stream table forms. The learned score function tells you: which direction in configuration space is "more probable"? That's the system's implicit action gradient. You've recovered the variational principle without writing equations. (A minimal training sketch appears at the end of this section.)

## Order Parameter / Control Parameter Coupling

Systematically vary control parameters (slope, discharge, sediment feed). Measure order parameters (sinuosity, braiding index, channel width, variance). Learn the mapping. The shape of this coupling surface reveals the system's phase diagram—where are the transitions? Where is hysteresis? The AI doesn't know geomorphology; it discovers phase structure from covariation.

## Attention as Salience Detection

Train a transformer on frame sequences to predict the next frame. Examine attention weights. Where does the model look to make predictions? Attention concentrates on the salience landscape—the regions where current state most constrains future evolution. The model learns what the table "cares about" without being told.

## Counterfactual Intervention Learning

Place barriers, change substrate, alter flow. Observe response. Train a model to predict intervention effects. The learned model encodes the system's causal structure—which field edits propagate how far, how fast, through what pathways. You're recovering the coupling topology.

## Separating Substrate from Script

The AI learns to distinguish:

- **Computational substrate**: the invariant transaction structure—how potential and constraint fields couple, what update rules govern their interaction
- **Emergent script**: the particular trajectory determined by initial intensive gradients, boundary conditions, and control parameters

The substrate is *what kind of computation* this landscape performs. The script is *which computation* it performs given specific initial conditions. Design intervenes at both levels: editing the substrate (changing what's computable) or editing the script (changing what gets computed).

For post-fire response: the substrate (coupling rules) may be relatively stable across burned watersheds. The script (where debris flows go) depends on specific topography, burn severity pattern, and rainfall distribution. AI trained on the stream table learns to separate these—enabling rapid assessment of new sites by recognizing substrate type and reading initial conditions.
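
Of these methods, the score-function claim is the most compact to state in code. The following is a minimal denoising score matching sketch, assuming observed forms arrive as single-channel rasters; the network size, noise level, and the placeholder batch are illustrative assumptions rather than the seminar's prescribed pipeline.

```python
import torch
import torch.nn as nn

class TinyScoreNet(nn.Module):
    """Maps a (noisy) form raster to an estimate of the score at that point."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def dsm_loss(model, x, sigma=0.1):
    """Denoising score matching: perturb observed forms with Gaussian noise and
    train the net to point back toward the data. At convergence the net
    approximates the score, grad log p(x), of the (smoothed) form distribution."""
    noise = torch.randn_like(x) * sigma
    target = -noise / sigma**2   # score of the Gaussian noising kernel
    return ((model(x + noise) - target) ** 2).mean()

model = TinyScoreNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
forms = torch.randn(8, 1, 64, 64)   # placeholder batch of observed form rasters
for step in range(200):
    opt.zero_grad()
    loss = dsm_loss(model, forms)
    loss.backward()
    opt.step()

score_field = model(forms)   # which direction in configuration space is "more probable"
```

Sampling from a trained score model (Langevin dynamics, or a full noise schedule as in standard diffusion models) would then generate candidate forms consistent with what the table has been observed to compute.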

---

## The Urgency

Post-fire landscapes are computational systems in crisis. Burned watersheds don't wait for design review cycles—debris flows mobilize within minutes of intense rainfall, channelizing through landscapes whose constraint fields (vegetation, soil structure, root networks) have been catastrophically altered while potential fields (rainfall, slope, accumulated sediment) remain unchanged or amplified.

Current response fails a critical population: approximately 60% of post-fire recovery and debris flow response comes from citizens themselves—residents, neighbors, community members making real-time decisions with inadequate information. They cannot interpret probability maps or model outputs. They need to *see* how their landscape computes risk.

This seminar develops methods for surfacing the physical intelligence already operating in damaged watersheds, enabling both professional designers and citizen responders to work *with* landscape computation rather than imposing interventions ignorant of it.

---

## The Lens

Computational Landscape Architecture reconceptualizes landscapes not as static compositions to be designed, but as active computational substrates continuously executing their own algorithms. The core model posits two reciprocal fields in bidirectional negotiation:

- **Potential field**: carries energetic drive (heat, moisture, momentum, gravitational head, resource gradients)
- **Constraint field**: encodes structure, boundaries, and memory (topography, vegetation patterns, soil structure, built form)

These fields engage in iterative trans-action—each field's gradient driving changes in the other—until they reach least-action configurations that manifest as observable landscape forms: channels, ridges, vegetative mosaics, fire scars, desire lines, debris fans, urban patterns.

Fire doesn't destroy the computation; it violently edits the constraint field while leaving potential fields intact. The system recomputes, often catastrophically.

## Dynamical Regimes

The model distinguishes three regimes:

| Regime | Character | Computation |
|--------|-----------|-------------|
| **Frozen / Too Ordered** | Locked in basin, ignores inputs | None—system is perceptually dead |
| **Chaotic / Critical** | Deterministic but sensitive, bounded | Turing complete—system perceives, computes, remembers |
| **Noise / Too Unordered** | Random, structureless | None—no memory, no propagation |

**Chaos is the target, not the hazard.** At criticality:

- Small inputs can produce large outputs (maximum leverage)
- The system is maximally sensitive to perturbation (it *perceives*)
- Information transfer is maximized (the system processes more bits)
- The system is Turing complete—it can compute

## Nature Tunes Itself

Self-organized criticality is the tendency. Complex systems with feedback migrate toward the edge because that's where they're maximally adaptive. Hübler's self-adjusting logistic map experiments demonstrate this: systems with slow parameter adjustment based on their own dynamics migrate to the boundary between periodicity and chaos without external direction.

The problem is human intervention that *prevents* self-tuning:

- Fire suppression locks forests into fuel-accumulation basins
- Channelized rivers freeze into engineered corridors
- Monocultures eliminate variance
- Flood control kills the hydrological pulse
- Zoning prevents urban self-organization

Each intervention removes feedback, decouples the dual fields, freezes the transaction. The landscape would find criticality if we stopped preventing it.

## Phase Transition Indicators as Design Diagnostics

Students learn to read these signatures—not as warnings, but as targets:

- **Variance increasing**: perceptual aperture widening, system loosening from fixed point
- **Critical slowing down**: system integrating more before responding, "listening"
- **Long-range correlation**: distant parts talking, coherence emerging across the whole system
- **Power law / fractal signatures**: scale-free structure, no privileged scale, information flows across all levels
- **Memory deepening**: path dependence strengthening, history encoded in structure

These indicators guide intervention: nudge frozen systems toward criticality so they can compute.
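
Two of these signatures, rising variance and critical slowing down, can be tracked from any order-parameter time series logged off the table. A minimal sketch follows, assuming the series is a plain 1-D array; the window length and the synthetic example series are arbitrary choices for illustration.

```python
import numpy as np

def rolling_indicators(series, window=50):
    """Early-warning indicators over a sliding window: variance (widening
    fluctuations) and lag-1 autocorrelation (critical slowing down: the
    system takes longer to forget perturbations)."""
    series = np.asarray(series, dtype=float)
    variances, autocorrs = [], []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        w = w - w.mean()
        variances.append(w.var())
        denom = np.sqrt((w[:-1] ** 2).sum() * (w[1:] ** 2).sum())
        autocorrs.append((w[:-1] * w[1:]).sum() / denom if denom > 0 else np.nan)
    return np.array(variances), np.array(autocorrs)

# Example: channel sinuosity sampled once per minute over a long run.
sinuosity = np.random.default_rng(1).normal(1.3, 0.05, size=600)
var_t, ac1_t = rolling_indicators(sinuosity)
# Sustained upward trends in both are the signature to watch for.
```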

The goal is not to model, predict, or control—but to restore the system's capacity to compute its own solutions.

## Designing for Least-Action Path Selection

You don't design paths. You design the fields that make desired paths least-action.

The landscape is already computing trajectories—water, fire, feet, animals, sediment all select paths that minimize action given current field configuration. Intervention isn't specifying routes but editing potential and constraint fields so that desired paths *become* the paths that cost least effort.

**Traditional design:** Draw path → build path → enforce path (signage, barriers, maintenance against desire lines)

**Computational design:** Read what paths the system is computing → identify divergence between desired and computed → edit field parameters (gradient, friction, permeability, visibility) → let the system re-solve

The desire line isn't failure—it's the system reporting what's actually least-action. Fighting it is fighting the computation. The design move: what field edit would make the designed path become the desire line? A toy numerical sketch at the end of this section makes the move concrete.

In post-fire contexts: the debris flow will take the least-action path through the burned watershed. You don't armor the predicted channel; you edit the field so the least-action path avoids structures. Design becomes: specify the action functional, not the trajectory.

## Transaction as the Fundamental Design Unit

The irreducible unit isn't path or form—it's the trans-action. Two fields in bidirectional negotiation until mutual least-action equilibrium.

| Process | Potential Field | Constraint Field | Trans-action |
|---------|-----------------|------------------|--------------|
| Hydrology | Gravitational + pressure head | Channel geometry, roughness | Water shapes channel; channel shapes flow |
| Fire | Heat, fuel energy | Moisture, topography, arrangement | Fire transforms structure; structure transforms fire |
| Debris Flow | Slope, accumulated mass, rainfall | Vegetation, soil cohesion, channel confinement | Flow erodes banks; banks channelize flow |
| Pedestrian | Destination attraction, momentum | Surface, slope, obstacles | Feet wear paths; paths channel feet |
| Ecological | Resource gradients | Species composition, soil | Organisms modify habitat; habitat selects organisms |

The designer's work:

1. **Recognize the transaction.** Find salience landscapes where bidirectional coupling is actively computing. Phase transition indicators reveal where the system is in dialogue with itself. Dead zones have decoupled fields.
2. **Identify the dual.** Every transaction has conjugate descriptions—Voronoi (territory, catchment, belonging) and Delaunay (network, connection, flow). Every -shed has a dual network. Knowing both lets you choose which representation makes intervention legible.
3. **Design the coupling, not the fields.** Leverage isn't in either field alone—it's in how they talk. Strengthen feedback: transaction tightens toward criticality. Weaken it: fields decouple, computation dies. The acequia gate doesn't change water or field—it modulates the transaction between them.
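
The toy sketch promised above: a least-cost path is computed over a friction grid, a hypothetical field edit (a corridor of lowered friction) is applied, and the path is recomputed. No route is drawn anywhere in the code; the route is whatever the edited field now makes least-action. Grid size, costs, and the corridor location are all illustrative assumptions.

```python
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra over a 2D cost grid (4-neighbour moves, cost paid on entering a cell)."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    frontier = [(0.0, start)]
    while frontier:
        d, (r, c) = heapq.heappop(frontier)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                prev[(nr, nc)] = (r, c)
                heapq.heappush(frontier, (dist[nr, nc], (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Uniform friction: the computed path runs straight downslope.
friction = np.ones((40, 40))
path_before = least_cost_path(friction, (0, 20), (39, 20))

# Field edit: a corridor of lowered friction a few columns to the east
# (say, a cleared and compacted surface). Nothing is drawn or enforced;
# the system re-solves, and the least-action path now detours through it.
friction[5:35, 28] = 0.2
path_after = least_cost_path(friction, (0, 20), (39, 20))
```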

---

# Decentralized Observation Systems

## Commodity Hardware as Instrument

The seminar teaches students to build observation systems from phones and projectors:

- **Structured light without specialized hardware.** The projector casts known patterns—stripes, grids, Gray codes. The phone captures pattern distortion. Depth inference from a single camera becomes tractable.
- **The phone as complete instrument.** Capture, compute, store, transmit, display, cast. The projector becomes a dumb light source; intelligence lives in the pocket.
- **Bidirectional flow.** Phone casts patterns *to* projector, captures reflections *from* surface. The instrument is a transaction—outbound structured light, inbound structured observation.
- **Field deployability.** Students learn a method they can carry anywhere. Stream table this semester; actual stream next summer; post-fire watershed next emergency.
- **Repair and replacement.** Phone breaks? Get another phone. Projector fails? Borrow one. No vendor lock-in, no calibration certificates. The system is robust because its components are commodity.
- **Pedagogical transparency.** No black-box depth sensor. Students cast the pattern, capture the image, inspect the reconstruction code. The instrument is legible.

This reduces friction. Specialized equipment creates priesthoods. Commodity equipment creates communities.

## From Lab to Landscape

The stream table is the training ground. The fireshed is the deployment.

**From structured light to structured time.** In the lab, the projector provides spatial structure. In the field, the sun provides temporal structure—known motion (shadows, illumination angle) that makes multi-camera geometry tractable.

**Sparse cameras, dense inference.** You don't need continuous coverage. You need enough viewpoints that the computational landscape's state is recoverable. A watershed changes at salience points: channel heads, knickpoints, confluences, debris fan apexes. Sparse cameras at high-information locations capture disproportionate signal. The AI learns where to look from the stream table; the field network inherits that attention.

**Sousveillance inverts power geometry.** Surveillance watches from above—satellites, aircraft, institutional infrastructure. Sousveillance watches from within—residents, hikers, ranchers, evacuees. The phone is already in the landscape, carried by people who live there. The observation network is the community, instrumented.

**Viewshed as primitive.** Every phone has a viewshed—the terrain it can see from its position. Overlapping viewsheds create coverage. Gaps in coverage are legible. The phone network computes its own observability, reports its own blind spots.

## Geometric Coordination Without Central Control

**Geo-registered pixels as shared coordinate system.** Every camera knows where it is (GPS), knows its orientation (IMU + landmark registration), knows its lens geometry (calibration). Every pixel becomes a ray in world coordinates. Different cameras looking at the same terrain point produce intersecting rays. No central server required; geometry is the protocol.

**Epipolar constraints as cooperation primitive.** Camera A sees a feature. Camera B's matching feature must lie on the epipolar line determined by the two cameras' poses. This constraint is computable locally—each camera pair can verify correspondence without global coordination. Distributed triangulation emerges from pairwise consistency. The network self-organizes 3D reconstruction the way the stream table self-organizes channels.
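
A minimal sketch of that pairwise check using OpenCV; the matched pixel coordinates would come from whatever feature matcher the phones run, and the function name and tolerance are placeholders, not a prescribed protocol.

```python
import numpy as np
import cv2

def pairwise_consistency(pts_a, pts_b, tol_px=3.0):
    """Estimate the fundamental matrix between two cameras from their own
    matched features, then report which matches satisfy the epipolar
    constraint. Computable by the pair alone, no central server required."""
    pts_a = np.asarray(pts_a, dtype=np.float32)
    pts_b = np.asarray(pts_b, dtype=np.float32)
    F, _ = cv2.findFundamentalMat(pts_a, pts_b, cv2.FM_RANSAC, tol_px, 0.99)
    if F is None:
        return None, None   # too few or too degenerate matches
    # Epipolar line in image B for each feature in image A: l = F @ [x, y, 1]
    homog = np.hstack([pts_a, np.ones((len(pts_a), 1), dtype=np.float32)])
    lines = (F @ homog.T).T
    a, b, c = lines[:, 0], lines[:, 1], lines[:, 2]
    # Distance of each candidate match in B from its predicted line.
    dist = np.abs(a * pts_b[:, 0] + b * pts_b[:, 1] + c) / np.hypot(a, b)
    return F, dist < tol_px

# Usage (hypothetical): F, ok = pairwise_consistency(matches_in_A, matches_in_B)
# Inconsistent matches are rejected locally; consistent ones feed triangulation.
```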

## Time as Interface

**Timestamp everything.** Now any citizen can scrub. The interface isn't a dashboard of model outputs—it's a DVR of the actual event. Drag the slider: watch the debris flow develop, watch the flood pulse move down channel, watch the smoke column grow. The landscape's computation is directly observable, not abstracted into risk polygons and probability contours.

**Risk as direct perception.** The formal model says "you are in a high-risk zone." The citizen asks: why? Based on what? The model can't show its work in terms the citizen can verify. But scrubbing time on actual observation—watching how fast the flow moved through similar terrain, watching which drainages channelized, watching where deposition occurred—gives the citizen *their own evidence*. They assess, not defer.

**Gibson over dashboards.** Ecological psychology applied to emergency management. The affordance is directly perceivable if you can see the landscape behaving. You don't need debris flow equations to see that the channel is mobilizing faster than you can evacuate. The perception is immediate, not inferential.

**The unfolding event as its own explanation.** Scrub back: how did we get here? The causal history is visible. Scrub forward (with model assist): where might this go? But the forward projection is marked as projection—the past is observation. Citizens learn to distinguish what they're seeing from what they're inferring. Epistemic hygiene through interface design.

## Community-Governed Observation Commons

**The acequia model of sensing.** Traditional acequias allocate scarce water through community governance—parciantes take turns, share maintenance, resolve disputes locally. A distributed camera network is the same structure: scarce attention (bandwidth, storage, processing) allocated across community-owned sensors. Who decides what gets watched? How is bandwidth allocated during an event? How do you govern a shared observation commons? The fireshed camera network is a commons. Design it like one.

**Models complement, don't replace.** Formal forecasting still matters—SimTable simulations, ensemble weather models, debris flow routing. But these inform experts making infrastructure decisions (prepositioning resources, staging evacuations). The citizen interface is different: here's what's actually happening, here's how it's been behaving, here's what you can see for yourself. Trust is grounded in observation, not authority.

**Coupling observation to simulation.** SimTable deployments compute forward (what might happen). Camera networks compute backward (what is happening). Couple them: real-time data assimilation where the landscape's actual computation corrects the model's predicted computation. The citizen sees both—observation and projection—clearly distinguished.

You're not democratizing model outputs. You're democratizing observation.
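
At its simplest, the observation-to-simulation coupling described above reduces to a nudging step: wherever the camera network can actually see, pull the simulated field toward the observed one. The sketch below is that minimal case; the weight, grid, and variable names are illustrative, and operational schemes (ensemble Kalman filters, particle filters) do considerably more.

```python
import numpy as np

def assimilate(forecast, observed, weight=0.3):
    """One nudging step: correct the model's predicted field toward
    camera-derived observation wherever observations exist (non-NaN cells)."""
    analysis = forecast.copy()
    seen = ~np.isnan(observed)
    analysis[seen] += weight * (observed[seen] - forecast[seen])
    return analysis

# forecast: e.g. simulated flow depth on a grid; observed: sparse depths
# triangulated from the camera network, NaN wherever no viewshed covers the cell.
forecast = np.zeros((100, 100))
observed = np.full((100, 100), np.nan)
observed[40:60, 30:35] = 0.5   # the one reach the cameras can actually see
corrected = assimilate(forecast, observed)
```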

---

# What This Model Makes Useful

All models are wrong. This one earns its keep by:

**Redirecting attention from form to process.** Forces the question "what is this landscape *doing*" before "what does it look like."

**Suggesting different intervention points.** If design is runtime editing: What parameters can I tune? What feedback loops can I amplify or dampen? What happens in year 10?

**Creating disciplinary interfaces.** Physics/computation language provides common ground with hydrologists, fire behavior modelers, debris flow scientists, ecologists—people who already think in flows and gradients.

**Making tool requirements legible.** If you believe the model, you know what to measure: temporal frequencies beyond human perception, dual-field proxies, phase transition indicators.

**Connecting design to governance.** If landscape is shared computational infrastructure, Ostrom-style questions arise: Who edits the running program? How do you govern access to the computation?

**Reframing success metrics.** Not "did I get the outcome I specified?" but "did I maintain conditions for the system to find outcomes I couldn't have specified?"

**Serving the 60%.** If most post-fire response comes from citizens, then observation systems must be citizen-operable and observation must be citizen-legible. Professional designers create conditions; communities inhabit and maintain them.

## What This Model Doesn't Do

It doesn't predict. It doesn't optimize. It doesn't give you the answer.

It gives you a way to ask: Where is computation happening? What's preventing it? What transaction am I intervening in? What's the dual? How do I tune the coupling so the system can find its own solutions?

The landscape computes. Citizens can see it computing. Design becomes collaboration with process—surfacing physical intelligence rather than overwriting it.

The landscape computes. You curate the phase space.