Trent Holms Petersen

Product Design Leader | Leading AI-Empowered Designers, Building Immersive, Intelligent Applications

Explorations in Spatial Computing & AR

A self-directed research and development sprint to explore the foundations of spatial (XR) design.

My Role: Independent Product Producer & Designer

Core Outcome: A conceptual framework for a spatial design system and a functional WebXR prototype ("Ghost House AR") to test its principles.

My Reflections (up front)

This self-driven project reinforces my "Product Producer" mindset. The lines between design, strategy, and development are blurring, and this is especially true in XR.

To lead in this new world, you must be technically curious, hands-on, and willing to build. This project demonstrates my personal passion and my capability to lead teams into this new, immersive frontier—not just by managing, but by understanding the technology from the ground up.

Spatial computing isn't just the next platform shift; it's a fundamental reimagining of how humans interact with information and each other. By building and testing now, I'm positioning myself—and any team I lead—to be at the forefront of this transformation.

Spatial Computing and AR Design

The "Why": Preparing for the Next Platform Shift

As a product leader passionate about the future of interaction, I'm not just waiting for the next platform; I'm actively prototyping for it.

The rise of spatial computing (AR/VR/MR) represents the most significant shift in human-computer interaction since the multi-touch screen. This shift breaks nearly all our 2D assumptions about design.

This project was a self-driven exploration to answer two core questions:

  • The System: How do we design scalable, intuitive systems for 3D space?
  • The Application: How can we build and test these new concepts in a rapid, accessible way?

This work was broken into two parts: a conceptual framework and a functional application.


Part 1: The System – A Conceptual Framework for Spatial Design

The Problem

How do proven 2D UX principles—like information hierarchy, affordance, and accessibility—translate to a 3D, volumetric environment where the user can move?

My Exploration

I dove into spatial HCI (Human-Computer Interaction) research, deconstructing existing platforms to understand the new "first principles." My research focused on answering fundamental questions:

Affordance: What is a "button" when it has no screen? How do users know they can interact with a 3D object using gaze, gesture, or voice?

Information Architecture: How does IA work when the user can physically walk around, and through, the content? How do we guide users through a flow?

Ergonomics & Comfort: What are the rules for legibility, field-of-view, and occlusion (when one object blocks another)? How do you prevent user fatigue?

Spatial Audio: How do sound cues enhance (or distract from) spatial navigation and understanding?

Gesture & Interaction: What gestures feel natural in 3D space? How do we design for multiple input methods—gaze, hand tracking, voice, controllers?

The Outcome

I developed a basic conceptual framework for a WebXR spatial design system. It's still in its infancy, but it establishes foundational principles for designing intuitive, accessible, and ergonomic spatial experiences.


Part 2: The Application – "Ghost House AR": A WebXR Testbed

The Problem

A conceptual framework is a start, but theories are useless without testing. I needed a simple, functional application to test these new spatial concepts in a real-world environment.

My Execution: Building "Ghost House AR"

I built "Ghost House AR," a simple web-based augmented reality game. I chose WebXR for its accessibility: it requires no app store, no download, and runs directly in a mobile browser. This makes it the perfect platform for rapid prototyping and user testing.

I handled the project end to end, from 3D modeling and asset creation to development (using A-Frame and three.js). The goal was to create a simple testbed for my spatial system's concepts:

  • How close does an interactive object (a "ghost") need to be to feel reachable? (see the proximity sketch after this list)
  • How do users react to spatial audio cues?
  • What is the most intuitive way to "collect" or interact with a 3D object?
  • How does real-world scale affect user understanding and engagement?
  • What level of visual fidelity is necessary for effective spatial interactions?
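
To probe the first question, reachability, the core pattern is a per-frame distance check between the user's camera and the object. A minimal sketch in A-Frame terms (the component name, radius, and visual cue are illustrative assumptions, not the shipped Ghost House code):

  // Sketch: mark a "ghost" as within reach when the camera comes inside
  // an arm's-length radius, and brighten it as a cue that it can be collected.
  AFRAME.registerComponent('reachable-when-close', {
    schema: { radius: { type: 'number', default: 0.75 } }, // ~arm's length, in meters
    init: function () {
      this.camPos = new THREE.Vector3();
      this.ghostPos = new THREE.Vector3();
    },
    tick: function () {
      const camera = this.el.sceneEl.camera; // the active three.js camera
      camera.getWorldPosition(this.camPos);
      this.el.object3D.getWorldPosition(this.ghostPos);
      const withinReach = this.camPos.distanceTo(this.ghostPos) < this.data.radius;
      this.el.setAttribute('material', 'emissiveIntensity', withinReach ? 1 : 0);
    }
  });

Attaching a component like this is a single attribute in markup, e.g. <a-entity gltf-model="#ghost" material="emissive: #9cf" reachable-when-close></a-entity>, which makes it cheap to tune the radius during user testing.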

The Technical Stack

  • WebXR API for cross-platform AR support
  • A-Frame and three.js for 3D rendering
  • Web Audio API for spatial audio implementation (sketched after this list)
  • JavaScript for interaction logic and state management
  • Responsive design for mobile-first accessibility
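
For the spatial audio piece, the Web Audio API's PannerNode does most of the work: it places a sound at a position relative to the listener and attenuates it with distance. A hedged sketch (the function name, URL, and coordinates are placeholders):

  // Sketch: play a one-shot audio cue positioned in 3D space around the listener.
  async function playGhostCue(audioCtx, url, x, y, z) {
    const response = await fetch(url);
    const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
    const source = new AudioBufferSourceNode(audioCtx, { buffer });
    const panner = new PannerNode(audioCtx, {
      panningModel: 'HRTF',     // head-related filtering for convincing 3D placement
      distanceModel: 'inverse', // the cue gets quieter with distance, like a real sound
      positionX: x,
      positionY: y,
      positionZ: z
    });
    source.connect(panner).connect(audioCtx.destination);
    source.start();
  }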

The Outcome

A tangible, working AR prototype. This app, while simple, was a critical testbed. It proved that I can not only theorize about spatial design but also execute and build functional prototypes.


Key Learnings & Principles Validated

Through building and prototyping, I validated several key principles for spatial design:

  1. Proximity & Reach: Users intuitively understand spatial interaction when objects are within a natural reaching distance.

  2. Spatial Audio Cues: Sound is a powerful tool for directing attention and creating immersive experiences in 3D space.

  3. Gesture Affordance: Subtle visual cues (glows, scale shifts) communicate interactivity better than text in spatial environments (a minimal sketch of this pattern follows the list).

  4. Cognitive Load: Limiting simultaneous interactive elements prevents overwhelm in 3D space.

  5. Accessibility-First Design: Spatial experiences must support multiple input methods and account for diverse physical abilities.
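
As an illustration of principle 3, the glow-and-scale affordance can live in a tiny reusable component. A sketch with assumed names, not the production code; the cursor's mouseenter/mouseleave events fire the same way for gaze, touch, and controller input, which also supports principle 5:

  // Sketch: signal interactivity with a glow and a slight scale-up on hover.
  AFRAME.registerComponent('hover-affordance', {
    init: function () {
      const el = this.el;
      el.addEventListener('mouseenter', () => {
        el.setAttribute('material', 'emissive', '#88f'); // soft glow
        el.setAttribute('scale', '1.1 1.1 1.1');         // subtle scale shift
      });
      el.addEventListener('mouseleave', () => {
        el.setAttribute('material', 'emissive', '#000');
        el.setAttribute('scale', '1 1 1');
      });
    }
  });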