Bridging Distances — A Platform Designed to Make Shared Space Feel Real
Hero — VR headset + mobile mockup side by side, full bleed
Video calls solved communication at a distance. They didn't solve presence.
During COVID-19, over 36% of Americans reported serious loneliness — including people who were technically connected through video and social platforms. What was missing wasn't conversation. It was the feeling of being somewhere together. Host was designed to close that gap: let users scan their physical spaces and share them, so that being far away doesn't mean being in separate worlds.
15 interviews and 200 surveys — and the finding that ambient closeness matters more than communication frequency.
85% of survey respondents expressed interest in a tool that allows virtual co-presence. But the interviews surfaced something more specific: people didn't want more ways to talk. They wanted to feel like they were in the same room, doing nothing in particular, together. Three behavior patterns emerged:
- 01. Ambient closeness. Users craved presence without the pressure to perform or converse.
- 02. Accessibility. Existing AR/VR tools felt ‘clunky’ and ‘overwhelming’ — a hard requirement, not a nice-to-have.
- 03. Control. People wanted meaningful control over what they shared and when — privacy wasn't a barrier to adoption, it was a prerequisite.
User personas — three distinct profiles representing the emotional, technical, and social needs driving product expectations
Two surfaces, three iteration rounds, and a central question: what does presence actually feel like in an interface?
The project moved through user journeys and low-fidelity wireframes across three structured iterations before reaching high fidelity. The design questions that drove each round: How should scanning feel in AR? What does exploring someone else's physical space look like in VR? Is the interface intuitive on first contact, without instruction?
We used Figma for the mobile surface and Unity for VR interaction prototypes — two very different tools that forced clarity about where the experiences diverged and where they needed to feel unified.
User journey maps — two key flows showing how a user goes from scanning their space to sharing it
Low-fidelity wireframes — mobile scan flow and VR exploration interface side by side
Mobile for scanning and sharing. VR for exploring. One seamless experience connecting them.
The mobile companion handled the entry point: scan a physical environment using your phone camera, publish it to a discovery feed, or share it directly with specific people. The discovery surface let users browse publicly shared spaces — from living rooms to parks to museum galleries — with social signals like trending scans making solo use feel less isolating.
The VR experience delivered the destination. The "Send to VR" flow made the transition from browsing on a phone to exploring in a headset effortless — one tap, then you're there. The final system connected every flow: onboarding → exploration → scan uploads → social feedback loops. Each piece reinforced the next.
Mobile high-fidelity screens — scan flow, discovery feed, space detail view with Send to VR button
VR interface — homescreen showing Popular Scans, Recently Downloaded, and spatial navigation controls
The concept contributed to a state of Texas grant awarded to the Soft Interaction Labs at Texas A&M.
Host was developed as part of a broader VR/AR healthcare and social research initiative. The work contributed to a grant application that was successfully funded by the state of Texas — validating both the research direction and the quality of the prototype as a proof of concept.
The biggest lesson wasn't about VR. It was about assumptions.
My group and I repeatedly had to be pulled back from building features we believed users wanted before we'd verified that they did. Discovering — mid-sprint, after investing real time — that a feature wasn't needed or wasn't feasible was the most valuable thing the project taught me. The professor's reminder to never forget who you're designing for sounds obvious until you watch yourself ignore it.
Working across mobile and VR simultaneously taught me a different kind of discipline: designing each surface to be complete on its own, so the connection between them feels effortless rather than necessary. That principle shows up in every multi-surface project I've worked on since.