Using Mixed Reality to Support Professional Networking Events

How might we encourage people to make more connections at professional networking events with augmented reality?


At professional events, where meeting people to establish valuable connections is key, having a conversation that's relevant, efficient, and fun is not always easy.


Our team developed a functional prototype on the Microsoft HoloLens that helps users identify relevant people to start a conversation with by surfacing attendee metadata in the moment, without disrupting normal conversational behavior.


Aaron Faucher

Ravi Morbia

Frank Teng


4 months


Microsoft HoloLens


Unity, Sketch, After Effects

What I did

My work focused on designing an augmented reality experience, developing a working prototype on the HoloLens, and producing the concept video.


"Networking events are awkward..."

What does it mean to encourage people to make more connections at professional networking events? We refined our problem and started with the following questions:

  • How might we make it easy to identify people with common interests?

  • How might we reduce awkwardness when approaching strangers?

  • How might we keep the conversation relevant and engaging?


AR might be a smarter solution

We first used both fly-on-the-wall and participatory observation methods to expose ourselves to the needs and motivations inherent in the networking context.

Main opportunities

Points of Connection


A participant wearing a University of Pittsburgh sweatshirt was approached by an individual asking more specific questions about the university.

People search for points of connection when starting conversations. A solution that reveals these points of connection could be useful.

Another participant was trying to learn a new programming language; the facilitator connected him with a member known to have experience in that language.

People rely on information sources such as a facilitator to connect with others. A solution that offers this information could help establish connections.


Current solutions have limitations

A number of mobile conference apps already help users connect with other participants. However, these apps require users to hold a device and compare the real world with the information on screen. This behavior is unnatural, awkward, and intrusive in normal conversational interactions.


Using generative sprints to come up with initial ideas

Generative methods included Google Ventures’ Crazy Eights format, ‘Yes And’ improv, and Round Robin.


Here are the ideas we decided to move forward with:

Bodystorming

Using post-it notes to simulate the experience of wearing an AR device

We ran several bodystorming sessions to quickly test our ideas. They exposed fundamental questions about integrating AR into a professional networking event.


When, where, and how to display what kind of data?

We prototyped the previously generated ideas with paper or low-fidelity HoloLens apps and ran several user-testing sessions to gain insight into the challenge above.

Insight 01

Information provided during conversation is intrusive.

We first tested an AR system that pushes information to users.

Content appears on its own.

During conversation, an information prompt appears.

The system provides two types of information.

Version 1: Additional information related to the conversation.

Version 2: Topics to talk about with the conversation partner.

User Feedback

  • Users felt interrupted by the appearance of the AR overlay when information prompts popped up.

  • When given conversation prompts, users also felt obliged to follow the prompts. The conversation became too directed.

We also tried an AR system that allows users to pull information.

Users bring up content manually.

The user starts interacting with the information board while talking.

User Feedback

In the pull system, one user described feeling snubbed when his conversation partner interacted with the device more than talking with him.

Design implication

Provide enough information before conversations to prepare users instead of prompting them during conversations.

Insight 02

People want to know what is displayed about them.

Users cannot see what information is displayed.

A private AR system provides information as users engage in conversation.

User Feedback

Users felt uncomfortable not knowing what information was hovering above their heads. In fact, they kept looking back at the paper board to check, which disrupted conversations.

Design implication

Give each user a private panel to check the information displayed.

Insight 03

People are fine sharing personal data that is already online.

We used storyboard speed dating to provoke reactions and explore users' comfort zones.

An extreme example was: "A stranger comes up to you and says, 'Wow, those hydrangeas outside your front porch are my favorite kind of flower!'"

Users are comfortable with others knowing information they’ve already shared online, such as LinkedIn profiles.

Design implication

Display information sourced from data users have already shared publicly.

Insight 04

Information about goal/interest/skill helps users prioritize whom to talk to.

We wanted to know what kind of information provides the most value. A few user tests revealed that:

  1. Info about goal/interest/skill helped users prioritize whom to talk to.

  2. Users wanted to know names and job titles.

  3. Info about locations or companies confused users.

Design implication

  1. Highlight goal/interest/skill.

  2. Display goal/interest/skill in a different form from names, job titles, and affiliations.


A tagging system to highlight information

What is the "tagging system"?

01. Enter tags

02. Display tags from all users

03. Filter tags for highlighting
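The three steps above can be sketched as a minimal data model. This is an illustrative Python sketch only; the actual prototype stores this data in a Google Sheet read by Unity, and all function and variable names here are hypothetical:

```python
# Illustrative sketch of the three-step tagging flow.

tags_by_user = {}  # step 01: each user enters their own tags

def enter_tags(user, tags):
    """Step 01: record a user's tags, normalized to lowercase."""
    tags_by_user[user] = {t.strip().lower() for t in tags}

def display_all_tags():
    """Step 02: the union of tags across all users, as displayed in the room."""
    return sorted(set().union(*tags_by_user.values()))

def filter_tags(selected):
    """Step 03: keep only users who share at least one selected tag."""
    wanted = {t.strip().lower() for t in selected}
    return sorted(u for u, ts in tags_by_user.items() if ts & wanted)
```

Keeping the tag vocabulary normalized (trimmed, lowercased) is what makes the later matching step reliable across users.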

Design the user interface

We designed a user control panel that:

  1. allows users to filter tags

  2. gives users an overview of their displayed information

Created by Ravi and Aaron.


Develop a functional prototype

Dynamic app data via Google Forms-Unity integration

We used Google Forms for its user-friendly interface, plus the fact that the resulting spreadsheet could easily be exported as JSON.

User inputs information and tags via Google Forms.

Google Forms outputs to a Google Sheet.

Unity uses REST to access the sheet data, parses the JSON, and integrates the user data into the AR scene.
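The parsing step in this pipeline can be sketched as follows. In the prototype, Unity fetched the payload over REST from the published sheet; this Python sketch parses a sample payload of the same general shape instead (the field names "name", "title", and "tags" are hypothetical, not the actual sheet schema):

```python
import json

# Sample payload standing in for the JSON exported from the Google Sheet.
SAMPLE_PAYLOAD = json.dumps({
    "rows": [
        {"name": "Ada", "title": "Engineer", "tags": "AR, Unity, hiking"},
        {"name": "Grace", "title": "Designer", "tags": "Unity, sketching"},
    ]
})

def parse_users(payload):
    """Parse the JSON payload into user records with normalized tag lists."""
    data = json.loads(payload)
    users = []
    for row in data["rows"]:
        users.append({
            "name": row["name"],
            "title": row["title"],
            # Tags arrive as one comma-separated form field; split and normalize.
            "tags": [t.strip().lower() for t in row["tags"].split(",")],
        })
    return users

users = parse_users(SAMPLE_PAYLOAD)
```

Splitting the comma-separated tags field at parse time means the AR scene only ever works with clean per-user tag lists.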

Tag rendering & filtering functionality

Local user's default view with all matching tags highlighted.

The local user highlights tags that are relevant to their own.
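The matching logic behind the highlighted view can be sketched as a set intersection per attendee. Again, this is an illustrative Python sketch, not the Unity implementation; all names are hypothetical:

```python
def matching_tags(local_tags, other_tags):
    """Tags shared between the local user and another attendee;
    these are the ones that would render highlighted in the AR overlay."""
    return sorted(set(local_tags) & set(other_tags))

def highlight_view(local_tags, attendees):
    """For each attendee, split their tags into highlighted (shared) and plain."""
    view = {}
    for name, tags in attendees.items():
        shared = matching_tags(local_tags, tags)
        view[name] = {
            "highlight": shared,
            "plain": sorted(set(tags) - set(shared)),
        }
    return view
```

For example, a local user tagged `unity` and `ar` would see an attendee tagged `unity` and `design` with `unity` highlighted and `design` plain.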


How effective is the tagging system?

Due to resource constraints, we could not test the envisioned networking event in which everyone wears an AR-enabled headset. Instead, we focused on how a user might prioritize potential conversation partners in a room of three people. We recruited five participants from various industry backgrounds; all were Master’s students between the ages of 25 and 35.


A real-time view recorded by HoloLens.


Participant conducting a Think-Aloud study.


Debriefing the study with a participant.



Results from user research

1. AR helped users prioritize who they want to meet.

2. Job titles and affiliation trumped matching interests.

3. AR created an opening for a conversation.

4. Users were comfortable sharing interests.



Our study showed that AR influenced which conversations users prioritized during networking meetups. Participants reflected that at past networking events they had chosen whom to talk to based on familiarity, proximity, availability, and superficial homophily (same gender, age, or affiliation), rather than their goals for the event.