<?xml version="1.0" encoding="utf-8"?>
<!-- generator="FeedCreator 1.7.2-ppt DokuWiki" -->
<?xml-stylesheet href="http://129.102.1.137/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="http://129.102.1.137/feed.php">
        <title>Music Representations Team efficace:wp</title>
        <description></description>
        <link>http://129.102.1.137/</link>
        <image rdf:resource="http://129.102.1.137/lib/tpl/repmus/images/favicon.ico" />
        <dc:date>2026-04-16T17:07:40+02:00</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/agger?rev=1585893964&amp;do=diff"/>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/analysis?rev=1490542413&amp;do=diff"/>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/heart-rate?rev=1473848011&amp;do=diff"/>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/musical-processes?rev=1463758601&amp;do=diff"/>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/pom?rev=1513769828&amp;do=diff"/>
                <rdf:li rdf:resource="http://129.102.1.137/efficace/wp/trajectoires?rev=1558944827&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="http://129.102.1.137/lib/tpl/repmus/images/favicon.ico">
        <title>Music Representations Team</title>
        <link>http://129.102.1.137/</link>
        <url>http://129.102.1.137/lib/tpl/repmus/images/favicon.ico</url>
    </image>
    <item rdf:about="http://129.102.1.137/efficace/wp/agger?rev=1585893964&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2020-04-03T08:06:04+02:00</dc:date>
        <title>efficace:wp:agger</title>
        <link>http://129.102.1.137/efficace/wp/agger?rev=1585893964&amp;do=diff</link>
        <description>Visualization, Control and Processing of Sounds as 3D Models

Savannah Agger is composer-in-residence at Ircam (from December 2016 to March 2017).
Her project Landschaften explores the idea of manipulating, hearing, and experiencing sound as a three-dimensional body in space.
This compositional work explores the idea of sound as a space in itself and the composition as a landscape, travelling through the different dimensions of the sound and its inner structures.</description>
    </item>
    <item rdf:about="http://129.102.1.137/efficace/wp/analysis?rev=1490542413&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2017-03-26T17:33:33+02:00</dc:date>
        <title>efficace:wp:analysis</title>
        <link>http://129.102.1.137/efficace/wp/analysis?rev=1490542413&amp;do=diff</link>
        <description>Concert: Livre Digital - Fronteiras Musicais (Musical Frontiers: Technologies that challenge/detune the senses)

Barão Geraldo, SP, Brazil, 28/08/2014

Closing the 1st Franco-Brazilian Colloquium on Computer-Aided Musical Analysis and Creation, Livre Digital will be an evening dedicated to exploring musical frontiers: improvisation outside any pattern, exploring new sonorities and processes. Technologies that challenge (and detune) the senses, algorithms in dialogue with human performers. It is up to the spectator to …</description>
    </item>
    <item rdf:about="http://129.102.1.137/efficace/wp/heart-rate?rev=1473848011&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2016-09-14T12:13:31+02:00</dc:date>
        <title>efficace:wp:heart-rate</title>
        <link>http://129.102.1.137/efficace/wp/heart-rate?rev=1473848011&amp;do=diff</link>
        <description>John MacCallum &amp; Teoma Naccarato

Musical research project - September-December 2014, IRCAM

The composer John MacCallum and the choreographer Teoma Naccarato propose a collaborative project that examines the use of real-time heart-rate data from contemporary dancers to drive a polytemporal composition for instrumental ensemble with live electronics.</description>
    </item>
    <item rdf:about="http://129.102.1.137/efficace/wp/musical-processes?rev=1463758601&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2016-05-20T17:36:41+02:00</dc:date>
        <title>efficace:wp:musical-processes</title>
        <link>http://129.102.1.137/efficace/wp/musical-processes?rev=1463758601&amp;do=diff</link>
        <description>Dynamic Music Generation with Formal Specifications

This example takes place in the context of automatic music generation systems combining formal specifications of temporal structures with interactivity. Such systems find applications, for instance, in computer improvisation. The objective is to embed agents generating musical material in high-level, formal yet interactive time structures. We consider the generation engine of ImproteK, an interactive music system dedicated to guided or composed…</description>
    </item>
    <item rdf:about="http://129.102.1.137/efficace/wp/pom?rev=1513769828&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2017-12-20T12:37:08+02:00</dc:date>
        <title>efficace:wp:pom</title>
        <link>http://129.102.1.137/efficace/wp/pom?rev=1513769828&amp;do=diff</link>
        <description>Philippe Leroux, Jérémie Garcia

Quid sit musicus? (2014)

Jérémie Garcia and Philippe Leroux used the OM reactive extension during the composition of Leroux's piece Quid sit musicus? (premiered in June 2014 at IRCAM during the ManiFeste festival).</description>
    </item>
    <item rdf:about="http://129.102.1.137/efficace/wp/trajectoires?rev=1558944827&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2019-05-27T10:13:47+02:00</dc:date>
        <title>efficace:wp:trajectoires</title>
        <link>http://129.102.1.137/efficace/wp/trajectoires?rev=1558944827&amp;do=diff</link>
        <description>Jérémie Garcia, Xavier Favory, Jean Bresson

(2015)

Trajectoires is a mobile application for drawing sound-source trajectories.
The trajectories remotely control any spatial audio renderer via the OpenSoundControl protocol.

We are investigating the new possibilities offered by mobile devices such as smartphones and tablets to support the control of sound spatialization by composers. This project aims to combine gestural input with touch input to draw trajectories and the sensors integrat…</description>
    </item>
</rdf:RDF>
