# Web Map to Equirectangular Projection System

## For Harvard Science Sphere Display

**Team:** Joshua Widdicombe, Andreas Plesch, Stephen Guerin, Devon Bryant

---

**Context:** This proposed approach complements Joshua Widdicombe's established OBS/MadMapper workflow for sphere display. Both solutions work together depending on content type and use case.

---

## Overview

A browser-native system to display live geospatial web maps on a spherical projection surface. The system captures web maps (Leaflet, OpenLayers, MapLibre, etc.), detects their projection, and transforms them in real time to equirectangular format using WebGL fragment shaders.

---

## Approaches for Sphere Content Delivery

### Joshua's OBS/MadMapper Workflow (Primary Pipeline)

**Potential use cases:**

- **Mixed media sphere content**: Combining maps with video, images, and animations on the sphere
- **Live sphere presentations**: Real-time mixing during lectures, events, and demonstrations
- **VJ events on the sphere**: Live performance mixing, transitions, and effects for sphere shows
- **Multi-source sphere composition**: Layering multiple inputs (maps + data overlays + graphics)
- **Curated sphere shows**: Pre-produced content sequences for specific audiences
- **Event-driven content**: When the sphere display needs live control and transitions
- **NDI/Spout ecosystem**: When integrating other production sources to the sphere

**Advantages:**

- Mature, stable production tool
- Joshua's expertise and tested workflows for sphere display
- Handles any content type (not just maps)
- Hardware acceleration via GPU
- Andreas's proven shader implementation
- Full control during live presentations and performances

### Browser-Native WebGL Solution (Complementary)

**Potential use cases:**

- **Automated sphere operation**: 24/7 playlist rotation in Cabot Library, museums
- **Map-focused sphere content**: When geospatial visualization is the primary need
- **Rapid prototyping**: Testing new map sources on the sphere without OBS scene setup
- **Unattended installations**: Library/museum displays that run without an operator
- **Direct web integration**: Live-updating web content (IRIS, Nullschool, etc.)
- **Remote sphere locations**: Where a full OBS rig may not be practical
- **Development/testing**: Iterating on projection math and map sources

**Interactive capabilities (HCI):**

- **Presence detection**: Browser computer vision to detect people approaching the sphere
- **Audience awareness**: Count viewers and adjust content based on crowd size
- **Mobile browser control**: Visitors control sphere content from their phones
- **User-submitted URLs**: Allow visitors to load URLs on the globe during their visit
- **Moderated content pipeline**: Visitor submissions reviewed before adding to the default playlist
- **Class/student playlists**: Students or classes load their own custom playlists for presentations
- **Dwell-time adaptation**: Change content pacing based on how long people stay

**Advantages:**

- Direct access to map APIs for precise control
- Lower latency (no screen-capture intermediate)
- Simpler for non-technical content updates (edit a JSON playlist)
- Can run on any device with a modern browser
- Easy to script and automate for long-term installations
- JavaScript ecosystem enables rich interactivity and sensing

### Hybrid Workflow: Best of Both

**Browser solution feeds OBS for sphere:**

```
Browser WebGL Output
    ↓ (via Spout/NDI)
OBS as mixer/compositor
    ↓ (via Spout)
MadMapper sphere calibration
    ↓
Science Sphere Display
```

**Use cases:**

- Browser handles map capture and reprojection to equirectangular
- OBS adds overlays, titles, and transitions between browser and other sphere content
- MadMapper handles final sphere geometry and calibration

**Or: Parallel deployment for sphere:**

- **OBS/MadMapper for active use**: Events, classes, curated sphere presentations (Joshua's workflow)
- **Browser solution for passive display**: Automated sphere content in the library between events
- Both output
equirectangular to the same sphere, switched via input selection or schedule

---

## Architecture

### Browser-Native WebGL Pipeline

```
Browser Window (fullscreen/iframe)
    ↓
WebGL context captures map viewport
    ↓
Fragment shader transforms projection
    ↓
Output texture renders to canvas
    ↓
Spout/NDI/Direct output → MadMapper → Science Sphere
```

### Browser + WebGL Capabilities

- **Hardware-accelerated rendering** via WebGL on the GPU
- **Direct map API access** to manipulate map objects and their render contexts
- **Real-time shader pipeline** with live parameter updates
- **Native texture sampling** directly from the map's WebGL/Canvas context
- **JavaScript ecosystem** for interactivity, sensing, and scripting
- **Lightweight deployment**: runs in any modern browser
- **Web standards**: leverages established APIs (WebGL, Canvas, WebRTC, etc.)

---

## System Components

### 1. Playlist Configuration (JSON)

```json
{
  "playlist": [
    {
      "name": "IRIS Seismic Monitor",
      "url": "https://iris.edu/app/seismic-monitor/map",
      "duration": 300,
      "projection": "mercator",
      "map_library": "leaflet",
      "manipulation": {
        "zoom": 1,
        "center": [0, 0],
        "bounds": [[-85, -180], [85, 180]],
        "disable_wrap": true
      },
      "css_hide": [
        ".leaflet-control-container",
        ".attribution",
        ".leaflet-control-zoom"
      ],
      "wait_for_load": 5000,
      "refresh_interval": 4200
    },
    {
      "name": "Nullschool Wind",
      "url": "https://earth.nullschool.net/#current/wind/surface/level/equirectangular",
      "duration": 180,
      "projection": "equirectangular",
      "skip_shader": true,
      "css_hide": [".attribution", ".core-ui"]
    }
  ],
  "transition": { "type": "crossfade", "duration": 2.0 },
  "output": { "resolution": [2048, 1024], "method": "spout" }
}
```
### 2. Map Detection & Introspection Module

**Auto-detect map library:**

- Scan for global objects: `L` (Leaflet), `ol` (OpenLayers), `maplibregl`, `mapboxgl`, `Cesium`
- Check DOM for container classes/IDs
- Inspect canvas/WebGL contexts
- Parse map metadata

**Extract map state:**

- Current projection (EPSG:3857, 4326, custom)
- Zoom level and bounds
- World wrap settings
- Tile loading state
- WebGL context reference

**Library-specific APIs:**

```javascript
// Note: none of these libraries keeps a global instance registry by default.
// The page must expose the map object, or an injected script must capture it
// (e.g. by wrapping the constructor). The calls below assume such a handle.

// Leaflet
map.setView([0, 0], 1);
map.setMaxBounds([[-85, -180], [85, 180]]);

// OpenLayers
map.getView().fit([-180, -85, 180, 85]);

// MapLibre
map.setCenter([0, 0]).setZoom(0);
```

### 3. Map Manipulation Module

**Viewport control:**

- Force zoom to full world extent
- Center at [0, 0]
- Set bounds to exactly -180° to +180° longitude
- Handle -85° to +85° latitude for Web Mercator
- Disable user interaction (pan, zoom, rotate)

**UI cleanup:**

- Inject CSS to hide controls, attributions, overlays
- Remove event listeners
- Disable animations if needed

**Tile loading:**

- Wait for complete tile load
- Poll or use map library events
- Timeout fallback
### 4. WebGL Capture & Transform

**Input capture methods:**

- **Option A**: Read from the map's existing WebGL/Canvas context
- **Option B**: Render the map to an offscreen canvas
- **Option C**: Use `captureStream()` for video

**Shader pipeline:**

```
Source Canvas/Texture
    ↓
Sample in fragment shader
    ↓
Apply projection transform (Mercator → Equirect)
    ↓
Output to fullscreen quad
    ↓
Display canvas or stream
```

**Fragment shader** (Andreas's HLSL adapted to GLSL):

```glsl
// Web Mercator to equirectangular
// Input: texture in EPSG:3857 (Web Mercator)
// Output: EPSG:4326 (equirectangular)

uniform sampler2D mapTexture;
varying vec2 vUv;

const float PI = 3.14159265359;

void main() {
  // Output pixel's latitude in equirectangular: -PI/2 to +PI/2
  float lat = (vUv.y - 0.5) * PI;

  // Convert latitude to Web Mercator Y
  float mercatorY = log(tan(PI / 4.0 + lat / 2.0)) / PI;
  mercatorY = (mercatorY + 1.0) / 2.0; // Normalize to 0-1

  // Longitude passes through (both projections are cylindrical)
  float lon = vUv.x;

  // Sample from the Mercator texture
  vec2 mercatorUV = vec2(lon, mercatorY);

  // Clamp to the valid Mercator range (~85°)
  if (mercatorY < 0.0 || mercatorY > 1.0) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // Black outside bounds
  } else {
    gl_FragColor = texture2D(mapTexture, mercatorUV);
  }
}
```

### 5. Playlist Sequencer

**State machine:**

- Load URL in an iframe or the main window
- Wait for page load + map initialization
- Run detection and manipulation
- Start the WebGL transform
- Display for the specified duration
- Transition to the next source

**Timing control:**

- Timer-based advancement
- Smooth transitions (crossfade, cut)
- Preload the next source in the background
- Handle auto-refresh pages (like IRIS)

**Error handling:**

- Detect load failures
- Skip broken sources
- Fall back to a static image
- Log errors for debugging
### 6. Output Pipeline

**Streaming options:**

- **Spout** (Windows): GPU-to-GPU texture sharing
- **NDI** (cross-platform): Network video
- **WebRTC**: Browser-to-browser streaming
- **Direct canvas**: If MadMapper can ingest from a browser

**Integration with MadMapper:**

- Spout.js library for browser → Spout
- Or: Electron app with native Spout bindings
- Or: WebSocket control of an external Spout source

---

## Implementation Phases

### Phase 1: Proof of Concept (Single Site)

- Load the IRIS seismic monitor in a fullscreen browser
- Detect the Leaflet map object
- Inject CSS to hide UI
- Set zoom and bounds
- Implement the WebGL shader transform
- Render to canvas
- Validate that the output looks correct on a flat screen

### Phase 2: Multi-Site Playlist

- Create the JSON playlist structure
- Build the playlist loader/parser
- Implement iframe management
- Add transition effects
- Test with 3-4 different map types

### Phase 3: Library Detection

- Build auto-detection for Leaflet, OpenLayers, MapLibre
- Create manipulation API wrappers per library
- Handle edge cases (Cesium 3D, D3 custom projections)
- Fallback strategies

### Phase 4: Output Integration

- Test Spout.js or Electron+Spout
- Validate that MadMapper can receive the stream
- Calibrate for actual sphere geometry
- Performance optimization

### Phase 5: Production Hardening

- Error recovery and logging
- Health monitoring
- Performance metrics
- Control interface (web dashboard)
- Documentation

---

## Technical Considerations

### Map Library Edge Cases

**Leaflet:**

- API access requires a handle to the map instance (Leaflet keeps no global registry)
- Common controls: `map.setView()`, `map.setMaxBounds()`
- May need to disable `worldCopyJump`

**OpenLayers:**

- Requires a handle to the map object (not exposed on the DOM by default)
- View control: `map.getView().fit(extent)`
- Projection handling is more complex

**MapLibre/Mapbox:**

- Access via a page-exposed instance or a constructor wrapper
- Vector tiles may need style modifications
- Camera controls: `setCenter()`, `setZoom()`, `setBearing()`

**Cesium:**

- 3D globe, not a 2D map
- May want an orthographic
camera view
- Different shader approach needed

**Nullschool (D3):**

- Already provides an equirectangular option
- May not need the shader transform
- Just hide the UI and use it directly

### Projection Transforms

**Common cases:**

- **EPSG:3857 (Web Mercator) → EPSG:4326 (Equirectangular)**: Use Andreas's shader
- **EPSG:4326 → EPSG:4326**: Pass through, no transform
- **Custom projections**: Need projection-specific shaders

**Shader math:**

- Inverse Mercator: `lat = atan(sinh(y))`
- Forward Mercator: `y = ln(tan(π/4 + lat/2))`
- Longitude: no transformation (both cylindrical)

### Performance Optimization

**Resolution:**

- Source capture: 2048×2048 to 4096×4096 (power of 2)
- Output: 2048×1024 equirectangular (2:1 ratio)
- Balance quality and frame rate

**GPU efficiency:**

- Single-pass shader preferred
- Minimize texture reads
- Use mipmaps for antialiasing
- Target 60 fps if possible

**Memory:**

- Reuse WebGL contexts
- Dispose of old textures
- Limit the number of preloaded sources

### Boundary Handling

**World wrap:**

- Detect whether the map repeats at ±180°
- Force bounds to prevent wraparound
- May need to stitch tiles at the antimeridian

**Polar regions:**

- Web Mercator is undefined beyond ~85° latitude
- The shader should clamp or fill with a solid color
- Some maps may have Arctic/Antarctic tile gaps

### Timing & Synchronization

**Page load:**

- Wait for DOM ready
- Wait for map library initialization
- Wait for initial tile load
- Configurable timeout per source

**Auto-updating content:**

- IRIS refreshes every 4200 seconds
- Don't switch scenes during an update
- Monitor for refresh events

**Transitions:**

- Crossfade requires a dual-render pipeline
- Or: a simple cut between sources
- Preload the next source to minimize the gap

---

## Open Questions

1. **Spout from browser**: Can Spout.js work reliably, or is Electron/native needed?
2. **Iframe and window options**: Security/CORS implications for map manipulation?
3. **Shader repository**: Store shaders in JSON? Separate files? Runtime compilation?
4. **Control interface**: Web UI for playlist editing? Or just a JSON file?
5. **Multiple projections**: How many custom shaders are needed beyond Mercator?
6. **Fallback content**: Static image? Black screen? Logo/branding?
7. **Monitoring**: How to detect whether a map failed to load properly?
8. **Globe calibration**: Does MadMapper handle all the sphere geometry, or are shader adjustments needed?

---

## Resources & References

- **Andreas Plesch's shader**: HLSL version for OBS (convert to GLSL for WebGL)
- **Nullschool source**: https://github.com/cambecc/earth
- **Map libraries**:
  - Leaflet: https://leafletjs.com/
  - OpenLayers: https://openlayers.org/
  - MapLibre: https://maplibre.org/
- **Spout.js**: GPU texture sharing for the browser
- **MadMapper**: https://madmapper.com/

---

## Notes

- This is a living document; update it as implementation progresses
- Current preference: the browser-native approach complements (not replaces) Joshua's OBS workflow
- **Target installations**: Harvard Science Spheres at:
  - Cabot Library
  - Natural History Museum
  - Graduate School of Design
  - Other campus locations