The body in augmented spaces; sensitive spaces as AV instruments

The workshop offers a hands-on introduction to multidisciplinary techniques and interaction tools that allow performers to control AV technologies physically, in real time. These innovative techniques are currently used in fields such as art, performance and music.

We will examine available platforms and resources, their creative and stage possibilities, areas of implementation, technical limitations and expressive potential.

This introductory workshop is aimed at creatives, artists in the broadest sense, and experts and technicians from different fields who are interested in incorporating motion-capture and motion-analysis technologies into creative and theatrical projects.

Structure:

We will begin by looking at the philosophy and evolution of interaction, through a visual tour of some key examples of its use in the history of art. This first part will offer an introduction to HCI (Human-Computer Interaction) and the theory of interaction.

In the second part we will set up some of the tools and platforms examined, offering participants the chance to experiment with them in a practical way.

Goals:

Participants will be encouraged to develop their own interactive/AV project based on what they have learned, incorporating technologies such as:

    • Computer vision

    • Sensors

    • 3D scanning

    • Biometric and motion-analysis software

    • Image and audio, control and synthesis software

Systems available for testing: computer vision, Kinect, Leap Motion and Arduino
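To give a flavour of the kind of mapping participants will experiment with, here is a minimal sketch (in Python, chosen here for illustration; the workshop does not prescribe a language) of how a raw sensor reading, such as an Arduino analog input in the range 0–1023, might be rescaled to a MIDI control value (0–127) that could drive an AV parameter like volume or brightness. The value ranges and function name are assumptions for this example only.

```python
def sensor_to_midi(raw, in_max=1023, out_max=127):
    """Clamp a raw sensor reading and linearly rescale it to MIDI CC range.

    raw     -- raw sensor value (assumed 0..in_max, e.g. Arduino analogRead)
    in_max  -- maximum raw value (1023 for a 10-bit ADC)
    out_max -- maximum output value (127 for a MIDI controller)
    """
    raw = max(0, min(raw, in_max))      # clamp out-of-range readings
    return round(raw * out_max / in_max)

print(sensor_to_midi(0))     # lowest reading  -> 0
print(sensor_to_midi(1023))  # highest reading -> 127
```

In a live setup this function would sit between the sensor input (serial, OSC, etc.) and the AV software receiving the control messages.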

Aimed at:

    • Performers, dancers

    • Programmers and event technicians

    • Visual artists, animators, AV producers

    • Dramatists, set designers

    • Musicians and DJs

Course leaders: Caen Botto // Marta Rupérez

(bios)


NO PREVIOUS KNOWLEDGE OR EXPERIENCE NECESSARY

YOU DON'T NEED TO BRING A COMPUTER



Friday December 4th

17:00 to 21:30


Espai Cultural

Can Ventosa, Aula 3

Ignasi Wallis 26

(entrance via Pere Francés)


35€

Limited to 10 people


Info & registration:

nourathar.m@gmail.com