Sensing and Tracking for 3D Narratives

This is an Archive of a Past Event

Great storytellers track the attention and responsiveness of their listeners, adapting the story – its context, plot, pace, voice and imagery – to acquire and hold audience engagement. Now that they can, users want to be in the story – included, immersed and interactive. Each person in the audience is increasingly in control of her own media experiences – selecting content, creating content, pausing at will – making her own stories. Media experiences are personalized, sharing is globalized, and sensory experiences are mediatized.

THIS EVENT IS FULL. Please email Jason Wilmot to be placed on a standby list in case of cancellations.

Tracking and sensing technologies open new opportunities to create stories in ways that not only include the audience but adapt to it. Augmented reality, virtual reality, machine learning, and data-driven algorithms open new opportunities for creating and experiencing narratives. Immersive experiences for business, art, education, and entertainment leverage insights from Stanford research to forge partnerships between storytellers and their audiences.

Join us on October 24th from 1pm-5pm in Tresidder Memorial Union Oak Lounge West to hear about the research behind these insights and glimpse new horizons, with keynote talks by Jeremy Bailenson and Chris Chafe. You'll also hear additional research presentations and experience exhibits and demos.