A Look into Theatre’s FutVRe by Carly B. Johnson
This piece, A Look into Theatre’s FutVRe by Carly B. Johnson, was originally published on HowlRound Theatre Commons (howlround.com/look-theatres-futvre) on April 18, 2019. Johnson explores the possibilities of burgeoning VR and AR technology to expand upon and enhance theatrical experiences.
Virtual reality (VR) and augmented reality (AR) technology is invading the theatre world.
VR and AR are two different monsters. VR hardware is bulky—think HTC Vive or Oculus Rift. The headgear is attached to wires, and the system requires a complex camera setup to track the user moving through the room. VR also utilizes controllers to help the user interact with their environment. AR hardware is physically much smaller—think Google Glass. While the ultimate goal of AR is everyday use to augment our reality, the goal of a VR experience is to create an entirely new one.
Creators at Carnegie Mellon University (CMU) in Pittsburgh, where I study, produced two exciting new projects during the 2018–19 school year—one using VR technology, and one using AR technology—that expand upon and enhance theatrical experiences.
The interactive learning tool Shakespeare-VR, co-developed with the American Shakespeare Center, Stitchbridge, and CMU’s dSHARP lab, uses VR to modernize and enhance high school–level Shakespeare education. The creators of the project realized that Shakespeare education traditionally lacks theatricality: students have to slog through the weighty texts without seeing the plays realized as they were meant to be—on a stage. Shakespeare-VR gives them that experience.
Stephen Wittek, CMU Shakespeare professor and director of the project, knows that VR technology is the perfect way to modernize the Bard because VR has the capacity to facilitate an active learning experience. The project is founded on the “learning by doing” philosophy. “In fields outside Shakespeare studies,” he says, “researchers have begun to establish a compelling evidentiary basis for the educative potential of active learning activities situated within virtual environments.”
When students—the intended users—strap on the VR headset, grab their controllers, and fire up this full VR experience, they find themselves seated amongst audience members at the American Shakespeare Center’s Blackfriars Playhouse in Staunton, Virginia. The users are treated to performances by real actors, dressed in full costume, delivering soliloquies to a responsive audience that laughs, claps, and exclaims.
But the students aren’t confined to their seats. They can use their controllers to point, click, and be transported to a new space in the theatre for a new viewing angle. Not only do they get to enjoy a performance (a recording of a real actor performing onstage), but they can also understand spatially what a thrust stage is, where the trapdoors are, and what it feels like to stand backstage watching the audience. They can see the performances from the Lords and Ladies’ seats up high, or from the standing-room-only space in the pit. The students can even get on stage with the performers.
This project immerses students in a theatrical experience, making the language of Shakespeare more accessible and interactive. On top of that, it plays a huge role in exposing students to high-quality theatrical experiences at a younger age. The hope is that this will deepen their appreciation for the performing arts and maybe even encourage them to pursue theatrical careers. The project’s creators hope that after experiencing Shakespeare-VR, not only will there be fewer complaints from students about having to sift through difficult text, but more of them will feel compelled to venture into a real theatre to see how reality and virtual reality compare.
Around the same time that Shakespeare-VR was being developed, TheatAR—a group made up of CMU graduate students—created Project Neverland, a staging of the opening scene of Peter Pan that uses AR technology as a way to enhance the production.
TheatAR’s project is full of thrilling promise for AR-enhanced theatre that expands what can be shown on a stage. The project’s creative director, Dan Wolpow, says that when he first heard about AR technology, he wondered what it would be like to combine real-life actors and animated characters on stage. The team noted that historically, in Peter Pan, “Tinker Bell is portrayed by a small pinpoint spotlight that shakes around as she ‘speaks,’ and flutters along the stage on a two-dimensional axis.” So, they decided to make Tinker Bell an actor herself using AR.
Creating a VR or AR experience requires a 360-degree camera to record and project 2D imagery in a way that feels 3D. The TheatAR team also needed animators, coders, and character designers to help bring their virtual performer to life. The project even presented the real performers with the unique opportunity to learn how to act with a scene partner they couldn’t see.
In the final iteration of this transmedia project, audience members sat in traditional theatre seats and watched performers act in real time. However, every audience member also watched the show through AR glasses, which allowed them to see Tinker Bell as well. Tinker Bell, though not a live performer, flew around the space in three dimensions, spoke with articulated facial movements, and interacted with the performers. Audiences said that when the performer made eye contact with the animated Tinker Bell, the feeling was “magical.”
The project was a learning experience for all involved, given that working with the practical realities of AR technology requires new and inventive ways of lighting and designing a set. Not only did the real-life elements have to interact with and react to Tinker Bell as if she were real, but they also had to integrate with the AR hardware, which required space for the cameras, enough headsets for all audience members, and a smaller scene space to limit where Tinker Bell could go.
While Shakespeare-VR created a deeper learning experience for students by transporting them to an explorable Shakespeare performance, Project Neverland enhanced a production experience by fully realizing a character that audiences have never gotten to see on stage in such a real way. These projects are only two examples of the new ways in which theatre and VR/AR technology can take advantage of the assets they each have to offer. The technology is still in its very early stages of development, which means the possibilities are also just beginning. The passion and promise in these two projects alone, though, are enough to demonstrate that the intersection of VR/AR and theatre is one worth exploring, and that the products of these collaborations are truly magical.