NYU CAT
The Butterfly
(Virtual Interactive Puppetry)

Creating virtual characters, a virtual stage, interactive 3D sound, and networked puppetry, the CAT seeks to redefine the nature of group interactions and live performance. In collaboration with colleagues from the Smartlab Centre in London, CAT researchers are developing tools to help people work and play with computer-generated objects in ways that freely and intuitively defy the traditional boundaries of space and scale. In a real-time, virtual collaborative environment people will "puppeteer" 3D avatars in a rich collaboration between the arts and computer science.
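The networked puppetry mentioned above implies that a puppeteer's pose updates are shared with remote participants. As a minimal sketch of one way such updates could be serialized, the snippet below packs a puppet's position and orientation into a small message and recovers it on the other side. The function names, message format, and field names are illustrative assumptions, not the CAT's actual protocol.

```python
import json

def encode_pose(puppet_id, position, orientation):
    """Pack one puppet's position (x, y, z) and orientation quaternion
    (w, x, y, z) into a small JSON datagram. Hypothetical format."""
    return json.dumps({
        "id": puppet_id,
        "pos": list(position),
        "rot": list(orientation),
    }).encode("utf-8")

def decode_pose(datagram):
    """Recover the puppet id, position, and orientation from a datagram."""
    msg = json.loads(datagram.decode("utf-8"))
    return msg["id"], tuple(msg["pos"]), tuple(msg["rot"])

# Example round trip (no network needed to see the encoding):
data = encode_pose("butterfly", (0.1, 1.2, -0.5), (1.0, 0.0, 0.0, 0.0))
print(decode_pose(data))
```

In a real collaborative environment, messages like these would be sent at the rendering frame rate over a transport such as UDP; the encoding shown here only illustrates the shape of the shared state.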

The team is developing a creative space in which a puppeteer, working with responsive stereo vision and 3D sound, can perceive graphically generated puppets as if they were within the puppeteer's hand/eye space. Using a combination of stereo glasses outfitted with trackable LEDs and wireless earphones, the system creates the illusion of a virtual puppet floating in 3D space. If the performance includes a live actor or dancer, stereo glasses will enable the puppeteer to see the combination of real and computer-generated actors as if they shared a single performance space. An audience in the theatre, watching the dancers through passive stereo glasses, will be able to experience the dancers interacting with life-size versions of computer-generated puppets. Our first version of the program gave the user control over the butterfly's behavior; recently, we extended that control to include the butterfly's appearance.
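Head-tracked stereo of the kind described above depends on knowing where each of the viewer's eyes is, given the tracked position of the glasses. The sketch below shows one simple way to derive the two eye positions from a tracked head position and an interpupillary distance; the IPD value, axis convention, and function name are illustrative assumptions, not details of the CAT's system.

```python
IPD = 0.064  # typical interpupillary distance in meters (assumed value)

def eye_positions(head_pos, right_axis, ipd=IPD):
    """Offset the tracked head position by half the IPD along the head's
    right-pointing unit axis to get the left and right eye positions,
    which then drive the two stereo views."""
    hx, hy, hz = head_pos
    rx, ry, rz = right_axis
    half = ipd / 2.0
    left = (hx - rx * half, hy - ry * half, hz - rz * half)
    right = (hx + rx * half, hy + ry * half, hz + rz * half)
    return left, right

# Head tracked at 1.6 m height, half a meter from the screen, facing it:
left, right = eye_positions((0.0, 1.6, 0.5), (1.0, 0.0, 0.0))
print(left)   # (-0.032, 1.6, 0.5)
print(right)  # (0.032, 1.6, 0.5)
```

Rendering each view from its own eye position (typically with an off-axis projection toward the screen) is what makes the virtual puppet appear to float at a fixed point in the room as the puppeteer's head moves.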

Applications
In addition to the Butterfly Project's performance augmentations, the CAT is seeking ways that this project can assist disabled persons in everyday activities and in therapy situations. Using virtual puppets as mobile extensions of themselves or their intentions, people with limited physical mobility will be able to participate in collaborative situations in new ways. We are currently developing a therapy suite with customized feedback visualization, breath-sensor control, and programmable exercises: a therapist can compose a series of exercises and communicate with the patient over the internet, and the patient 'plays' with the butterfly by performing the breathing exercises.
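The programmable breathing exercises described above can be thought of as a therapist-authored sequence of target breath levels that the patient's sensor readings must reach in order. The sketch below illustrates that idea with normalized readings in the range 0 to 1; the data format, tolerance, and function names are hypothetical, not the therapy suite's actual design.

```python
def compose_exercise(targets, tolerance=0.1):
    """A therapist-defined exercise: a sequence of breath levels the
    patient should reach, each within the given tolerance (assumed)."""
    return {"targets": list(targets), "tolerance": tolerance}

def run_exercise(exercise, sensor_readings):
    """Step through the exercise, consuming breath-sensor readings until
    each target is hit in order. Returns the number of targets completed,
    which could drive the butterfly's feedback visualization."""
    targets = exercise["targets"]
    tol = exercise["tolerance"]
    done = 0
    for reading in sensor_readings:
        if done < len(targets) and abs(reading - targets[done]) <= tol:
            done += 1
    return done

# Deep breath, exhale, deep breath again:
exercise = compose_exercise([0.8, 0.2, 0.8])
readings = [0.1, 0.75, 0.5, 0.25, 0.4, 0.85]
print(run_exercise(exercise, readings))  # 3: all three targets reached in order
```

In the networked setting the source describes, the therapist would send the composed exercise to the patient's machine, where live sensor readings drive both this progress check and the butterfly's on-screen response.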

Demo
A video overview of the project is available.

Principal Investigators
Ken Perlin, Courant Institute of Mathematical Sciences
Kate Brehm, Joel Kollin, Daniel Kristjansson, and Jeremi Sudol

Email
For additional information, contact: info@cat.nyu.edu

