Summary
Planning session for a faculty demo of the LED sphere display at Harvard's Cabot Library ("Discovery Bar"). The event is tentatively set for April 13 at noon: a 20-30 minute structured presentation followed by freeform exploration and Q&A. The audience will be Harvard science faculty, with food provided.
Event Format
- 20-30 minutes of lightning demos (a few minutes each)
- Freeform Q&A and hands-on exploration afterward
- Two spheres active, each as a separate demo station during freeform time
- ~15-person audience with chairs, casual/open format
Individual Contributions
Josh Widdicombe -- TouchOSC iPad interface controlling Mad Mapper and Notch. Lunar/Mars satellite imagery with rotation controls (east-west rotation; north-south distorts too heavily). Also covering sphere technical overview (how it works, pixel pitch, projection).
Stephen Guerin -- Three focus areas: (1) equirectangular projection warping via WebGPU/WebGL for content from different formats, (2) user content submission and playlist management with authentication (Harvard email, curation workflow, URL-based submissions), (3) time-based event visualization with layers (hurricanes, pandemics, air traffic, weather archives). Also proposed: photo geolocation from laptops (no upload needed, runs locally), semantic research mapping on the sphere, and AI-assisted paper visualization (demoed converting Greg's PhD thesis into sphere visuals).
Greg Kestin -- Particle collider visualizations from his particle physics background (LHC collisions, light shell theory). Brainstormed interactive audience demos: wave propagation, tsunamis, pandemic spread, tectonic plates, species migration, lightning/thunder visualization. Emphasized keeping demos short and varied ("sprinkles of ideas") to inspire faculty.
Devon Bryant -- Cinema 4D virtual 360-camera workflow. Will demonstrate the dome camera setup (inward-pointing camera capturing all directions), the equirectangular output, and how it maps back onto the sphere. Plans to use the Discovery Bar's interior screen alongside the sphere.
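Stephen's projection warping and Devon's 360-camera pipeline both hinge on the same mapping: a 2:1 equirectangular frame maps longitude and latitude linearly to pixel coordinates. A minimal Python sketch of that mapping, assuming the 1920x960 frame size from the technical specs (function names are illustrative, not from the meeting):

```python
WIDTH, HEIGHT = 1920, 960  # 2:1 equirectangular frame size used for sphere content

def latlon_to_pixel(lat_deg, lon_deg):
    """Map latitude [-90, 90] and longitude [-180, 180] to pixel coordinates."""
    x = (lon_deg + 180.0) / 360.0 * WIDTH
    y = (90.0 - lat_deg) / 180.0 * HEIGHT
    return x, y

def pixel_to_latlon(x, y):
    """Inverse mapping: pixel coordinates back to lat/lon in degrees."""
    lon = x / WIDTH * 360.0 - 180.0
    lat = 90.0 - y / HEIGHT * 180.0
    return lat, lon
```

The same linear relationship is what a WebGPU/WebGL fragment shader would evaluate per-pixel when warping other formats into equirectangular space.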
Interactive / Audience Participation Ideas
- Phone-based interaction via QR code (like SimTable): tap to create waves, trigger events
- Wave propagation on the globe (ocean waves, sound, tsunamis)
- Collaborative agent simulations where each audience member controls an agent
- Time slider for historical events (air traffic changes, weather, pandemic spread)
- Speed-of-light and speed-of-sound visualizations on the globe
Technical Specs
- Content format: 1920x960 equirectangular (2:1 ratio) preferred; 1920x1080 also works (Mad Mapper scales)
- Setup: 4 monitors off one machine (2 sphere, 1 control, 1 content capture via NDI)
- Alternative: laptop with capture card for contributor-controlled demos
- Interactive content: separate UI window captured alongside content window
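Contributor-submitted content could be sanity-checked against these formats with a few lines. A sketch based only on the preferences above (the return labels are illustrative):

```python
def check_content_format(width, height):
    """Classify a submitted frame size against the sphere's preferred formats."""
    if (width, height) == (1920, 960):
        return "preferred"       # native 2:1 equirectangular
    if (width, height) == (1920, 1080):
        return "ok"              # Mad Mapper scales this
    if abs(width / height - 2.0) < 1e-9:
        return "ok-2to1"         # other 2:1 sizes should map cleanly
    return "needs-rescale"
```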
Faculty Outreach
- John Shaw's priority: engage researchers across disciplines
- Target departments: climate science, computer science, EPS, astronomy, exoplanets
- Martin Wattenberg (Harvard CS, formerly Google) as potential collaborator
- Goal: faculty submit their own content/research for the sphere
Key Quotes
"If you go on Claude Code and say, make me this applet... you can just drop it on here. They will love that." -- Greg
"It's a talking dog demo. You don't care what the dog says, the fact that it's talking is interesting." -- Stephen, on AI paper visualization
"Think of the whole portfolio that John manages as research... you could lay it out in equirectangular on the sphere." -- Stephen
Recording & Resources
- Video: Zoom cloud recording
- Transcript: VTT transcript
- Summary: AI meeting summary (.md)
- Chat: Zoom chat log
Action Items
- Greg -- Create Google Drive folder "April Sphere LED Lightning Demo in Cabot" and invite everyone (Harvard emails). Share brainstorming doc and idea list. (Done during meeting)
- All -- By end of next week (~April 3): everyone picks at least one "showstopper" demo idea and documents it in the shared Google Doc.
- All -- Internal demo day ~April 6 (Monday): show MVPs to each other, iterate on feedback. Stephen joins via Zoom.
- Josh -- Lock down the date for the faculty event (target April 13 at noon, fallback April 20). Confirm with Susan/Cabot Library.
- Josh -- Prepare technical setup: 4 monitors (2 sphere, 1 control, 1 content capture), capture card for laptop input, Mad Mapper mappings.
- Josh -- Build TouchOSC-to-Mad Mapper interface demo with Lunar Reconnaissance Orbiter and Mars satellite imagery datasets.
- Stephen -- Work on equirectangular projection warping (WebGPU/WebGL), user content submission workflow, and a time-based event layer demo (e.g., hurricane, pandemic spread, air traffic).
- Stephen -- Reach out to Martin Wattenberg (Harvard CS) about potential collaboration.
- Stephen -- Coordinate with Greg on the applet-creation workflow (Claude Code to sphere pipeline).
- Greg -- Explore particle physics visualizations (LHC collisions, light shell theory from his PhD thesis) for sphere content.
- Greg -- Reach out to Harvard faculty (exoplanets, climate science, EPS, astro, CS) for research collaboration ideas. CC the team on any outreach to avoid overlap.
- Devon -- Prepare 5-10 min demo of his C4D virtual 360-camera workflow for creating sphere content, showing the dome camera setup and equirectangular rendering pipeline.
- Stephen -- Keep April 13 open (9 AM - 2 PM) until date is confirmed.
- All -- Mid-next-week: schedule a follow-up call via email.