Spirited Ruins is a distributed, immersive environment which showcases visual and auditory works created by invited artists. It uses high-bandwidth, low-latency networking to interconnect ImmersaDesks, CAVEs, interactive kinetic sculptures, workstations, and web clients to create a shared virtual world. It features local animation, globally synchronized events and event multi-casting, 3D sound localization, telephony, and distributed audio streams.
The virtual tourist explores the ruins of a palace built by an ancient civilization which believed that its creations would spontaneously come to life by virtue of their particular form and proportions. Through a confluence of magic and technology, it is possible for participants in the virtual world to bring these long-dormant objects to life. For instance, gallery visitors who interact with kinetic sculptures may appear as gyrating palm trees, and web participants may appear as swarms of bees. This concept of "translated interaction modalities" is a key component of Spirited Ruins. It serves as a metaphor for the evolving electronic culture; it blurs the boundaries between human beings and electronic agents; and it spurs the creation of new modes of computer-mediated human interaction.
Long ago, a mysterious culture devoted itself to creating
art that would come to life. They believed that by producing objects and
environments in just the right form and proportion, life would spontaneously
enter their creations. The palace they set out to build used forms from
nature such as sea shells and beehives, combined with what they considered the
most perfect geometric forms. Walking through the palace, one might pass from
a majestic gorge to the inside of a huge anthill to a courtyard
of pillars and fountains. Throughout this landscape are objects, statues,
mobiles, paintings, tapestries, and furniture, all created to summon
the force of life.
For unknown reasons, the culture disappeared. Their unfinished palace is
a ruin, the sculpture within it in various stages of decay. Some sections
are virtually unchanged; others are nothing more than dusty outlines.
In a rare conjunction of time, magic, and technology, we are now able
to inhabit the various elements of the palace through virtual reality.
What a visitor finds is a marvel of natural and geometric architectural shapes,
grown over with luscious green and varicolored plant life, enlivened by
butterflies and the calls of hidden birds. Entering various courtyards,
amphitheaters, buildings, and galleries, the visitor finds objects and
sculptures that respond to his or her
proximity and gestures. Some seem to have a life
of their own, as if the original hopes of their creators had been realized.
Familiar and unfamiliar creatures flit about; some objects move, change
size, chase each other, talk or sing.
Ideally, this environment will be run on several ImmersaDesks
and, if at all possible, in a CAVE.
Additionally, there will be other "satellite interaction stations" at various
locations which connect to the virtual world. These make use of
alternative input and output devices, which allow the visitor to manipulate
(physical) sculptural pieces and have the manipulations translated
into actions in the virtual world.
Similarly, actions in the virtual world are translated
by the satellite stations into graphical displays, moving sculpture, and/or
audio output.
In a CAVE or at an ImmersaDesk, a 'pilot' navigates through
virtual rooms using a tracked,
hand-held input device. Pilots can interact with objects using a virtual
light-sword connected to this input device. In the simplest case,
pressing a button triggers an object's pre-programmed behavior. Some
objects are connected to autonomous agents, giving them more complex
behaviors. In response to the participants'
actions, or based on the agents'
own internal heuristics, objects may move, change form,
produce sonic variations, etc. All pilots in the networked world are
represented by avatars, so they may interact with each other by
navigating in the space and by talking via the system's integrated telephony.
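To make this interaction model concrete, the following minimal sketch, written in C with hypothetical types and function names (WorldObject, Event, broadcast_event) rather than the actual CAVE library or project code, shows how a button press on the wand might start an object's pre-programmed behavior locally and broadcast a corresponding event so that every connected site plays the same animation.

    /*
     * A minimal sketch, assuming hypothetical types and functions (WorldObject,
     * Event, broadcast_event): a wand button press starts an object's
     * pre-programmed behavior locally and broadcasts an event so every site
     * plays the same animation.  Not the actual CAVE library or project code.
     */
    #include <stdio.h>

    typedef struct {
        int   id;            /* object identifier shared by all sites      */
        char  name[32];
        int   animating;     /* nonzero while its behavior is running      */
        float anim_time;     /* seconds elapsed in the current behavior    */
    } WorldObject;

    typedef struct {
        int object_id;       /* which object the event refers to           */
        int type;            /* e.g. EVENT_TRIGGER                         */
    } Event;

    enum { EVENT_TRIGGER = 1 };

    /* Stand-in for the networking layer: a real system would hand the event
     * to an event server, which would multi-cast it to the other sites.    */
    static void broadcast_event(const Event *ev)
    {
        printf("broadcast: object %d, event type %d\n", ev->object_id, ev->type);
    }

    /* Called when the light-sword ray hits an object and the button is
     * pressed: start the behavior locally and announce it to the world.    */
    static void trigger_object(WorldObject *obj)
    {
        Event ev;

        obj->animating = 1;
        obj->anim_time = 0.0f;

        ev.object_id = obj->id;
        ev.type      = EVENT_TRIGGER;
        broadcast_event(&ev);
    }

    /* Per-frame local animation update, driven by the frame time step.     */
    static void update_object(WorldObject *obj, float dt)
    {
        if (obj->animating) {
            obj->anim_time += dt;
            if (obj->anim_time > 5.0f)      /* behavior lasts five seconds  */
                obj->animating = 0;
        }
    }

    int main(void)
    {
        WorldObject statue = { 17, "singing statue", 0, 0.0f };
        int frame;

        trigger_object(&statue);                /* pilot presses the button */
        for (frame = 0; frame < 3; frame++)
            update_object(&statue, 1.0f / 30);  /* about 30 frames/second   */

        printf("%s animating: %d\n", statue.name, statue.animating);
        return 0;
    }

In the running installation the broadcast step would go through the networked event system rather than a print statement, but the overall flow (trigger locally, announce globally, animate each frame) is the pattern the sketch is meant to convey.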
Interacting with the satellite stations is a different experience
from being immersed in the virtual world. For instance, a person at such a
station might blow into a tube and watch the effect that varying the breath
has on a lighted panel display. In the immersive world, the effect of blowing
might be the swinging of a pendulum. A visitor might pick up and play with
a furry, flexible, abstract object. The movement
generated by this tactile interaction might be translated into the
sound of a thunderstorm or the motion of an amusement park ride
in the virtual world.
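As one possible illustration of such a translation, the minimal sketch below assumes the breath sensor delivers a MIDI controller value between 0 and 127 and maps it to a pendulum swing amplitude in the virtual world; the mapping and the function names are illustrative assumptions, not the installation's actual code.

    /*
     * A minimal sketch of the "translated modality" idea, assuming the breath
     * sensor arrives as a MIDI controller value (0-127) and that a hypothetical
     * send_pendulum_update() carries the result into the shared world.
     */
    #include <stdio.h>

    /* Map a MIDI controller value (0-127) to a swing amplitude in degrees. */
    static float breath_to_swing_degrees(int midi_value)
    {
        if (midi_value < 0)   midi_value = 0;
        if (midi_value > 127) midi_value = 127;
        return (midi_value / 127.0f) * 60.0f;   /* 0 .. 60 degrees          */
    }

    /* Stand-in for sending the translated value into the virtual world.    */
    static void send_pendulum_update(float degrees)
    {
        printf("pendulum swing amplitude: %.1f degrees\n", degrees);
    }

    int main(void)
    {
        int samples[4] = { 0, 32, 96, 127 };    /* pretend breath readings  */
        int i;

        for (i = 0; i < 4; i++)
            send_pendulum_update(breath_to_swing_degrees(samples[i]));
        return 0;
    }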
A key component of Spirited Ruins is the concept of translated interaction
modalities. The representations of participants may well be direct and
faithful but, by the very nature of virtual reality, these representations
could just as well be exaggerated, distorted, or misleading. A participant
does not necessarily know whether the behaviors he or
she observes come from
immersed participants, participants at satellite stations, web surfers, or
autonomous agents. Participants interacting with each other and the
environment may each be presented with very different representations.
This idea of translated modalities has several implications. First,
it is a metaphor for the current and future interactions presented by
the evolving electronic culture. Second, it blurs the distinction between
communicating with another human being and an electronic agent. Third,
it provides an exciting testbed for new input and output methodologies.
We welcome institutions that would like to join in from remote sites and
that have facilities capable of supporting the required graphics.
A major emphasis of this project is the integration of work
by a diverse range of contributors and participation from sites spread
across a wide geographic area.
We expect many attendees to be drawn to this installation. Spirited Ruins
is a collaboration among artists and software developers in a
cutting-edge computer graphics environment. The immersive experience is
rich in visual complexity, participant interaction, and ambient activity.
The navigation, 3D localized sound, globally synchronized event streams,
and event multi-casting each rely on new, innovative
methodologies. The concept of translated interaction modalities is novel
and timely.
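As a schematic illustration of the event multi-casting mentioned above, the sketch below shows a generic reflector pattern that a distributed world of this kind might use: a central server receives each event from one connected site and forwards it to every other site, so that all copies of the world apply the same stream of events. The types and functions are hypothetical and do not describe DAFFIE's actual protocol.

    /*
     * A schematic sketch of event multi-casting via a central reflector:
     * each event received from one site is forwarded to every other site,
     * so all copies of the world apply the same event stream.  Hypothetical
     * types and functions; not DAFFIE's actual protocol or code.
     */
    #include <stdio.h>

    typedef struct {
        int source_site;     /* which site generated the event             */
        int object_id;
        int type;
    } Event;

    /* Stand-in for a per-site network send.                                */
    static void send_to_site(int site, const Event *ev)
    {
        printf("site %d receives: object %d, type %d (from site %d)\n",
               site, ev->object_id, ev->type, ev->source_site);
    }

    /* Reflect one incoming event to every connected site except its origin. */
    static void reflect_event(const Event *ev, int num_sites)
    {
        int site;

        for (site = 0; site < num_sites; site++)
            if (site != ev->source_site)
                send_to_site(site, ev);
    }

    int main(void)
    {
        Event ev = { 2, 17, 1 };    /* site 2 triggers object 17            */

        reflect_event(&ev, 4);      /* four sites connected                 */
        return 0;
    }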
Through our High Performance Computing in the Arts (HiPArt) project, Boston
University's Scientific Computing and Visualization Group (SCV) is
committed to fostering a collaboration between software developers and
computer artists. A particular emphasis is placed on making very high
performance computing and graphics resources available to the art
community. The members of SCV have made it possible for artists who have
created work in their own computing environments to bring their models and
music into a virtual reality environment. We have had very enthusiastic
responses from these artists when their work is experienced in the context
of work by other artists in the high-performance graphics environment we
offer.
Several projects have preceded Spirited Ruins. The first was a group
project in the Boston University
School for the Arts 3D Design class taught
by Laura Giannitrapani, Manager of Graphics Consulting in SCV. This project
was a virtual feast, in which each student contributed an animated element.
This stand-alone program,
which was among the first to be run on our ImmersaDesk,
allowed the pilot to move around the virtual dinner table and
activate the food items by looking at them.
The second project, called ArtWorld, was based on the theme of a Sixteenth
Century Martian Toy Store, and contained animated virtual toys contributed
by artists working independently and
artists from the Boston Museum School, Boston University,
Massachusetts College of Art, the Rhode Island School of Design, and
Springfield College.
This is a distributed application in which multiple
participants, represented by avatars, navigate through a virtual space
composed of five distinct locales, triggering the toys' motion and sounds
by using virtual light swords. ArtWorld has been shown in large
venues a number of times during the last year, including the Alliance 98
Meeting in Champaign-Urbana, the High Performance Distributed Computing 98
Conference in Chicago, the Supercomputing 98 conference in Orlando, and
the Internet2 meeting in San Francisco. It has also been shown many times
to smaller groups, including an opening for the participating artists.
In each of these showings, participants from several locations across the
country have interacted in this virtual world over a very high speed network.
Participants have enjoyed navigating through the world and interacting with
the artworks and with each other, often finding ways to use the
environment that its creators did not imagine.
The hardware used for the primary interactive display consists of a Pyramid
Systems CAVE or ImmersaDesk (1-4 video projectors and tracking hardware),
an SGI Onyx2 with at least one gigabyte of memory,
and an audio subsystem (2 SGI Indys with at least 128 megabytes of memory,
4 powered speakers, 1
powered subwoofer, and 1 audio mixer). The software used for the primary
interactive display consists of SGI IRIX, the CAVE library, SGI Performer,
and audio, graphics, and networking code developed at Boston University.
The hardware used for the satellite installations includes SGI O2
workstations, MIDI I/O interfaces (e.g., Pavo MIDItools Computer), and
interactive sculptures created by participating artists. The software used
for the satellite installations consists of SGI IRIX and audio, graphics,
and networking code developed at Boston University.
This application requires a high-bandwidth, low-latency network; a
bandwidth of at least two megabits per second and network latencies of no
more than 100 milliseconds are required for best performance.
These projects are built on top of the DAFFIE (Distributed Architectural
Framework For Immersive Environments) software developed by members of
SCV, including Glenn Bresnahan, Director; Erik Brisson, Manager of
Graphics Programming; Robert Putnam, Audio Programmer;
and Kathleen Curry, Graphics Programmer.
Copyright 1999, Erik Brisson and the Trustees of Boston University.