The Berlin MIR Meetup team wishes you all a good start in 2019!
For the first Berlin MIR meetup of this year, and the 17th installment in total, we are very excited to announce the following talk:
Philippe Esling and Adrien Bitton (ACIDS, the Artificial Creative Intelligence and Data Science team at IRCAM):
Variational inference for modeling musical creativity
The research project carried out by the ACIDS team at IRCAM seeks to model musical creativity by extending variational learning approaches towards the use of multivariate and multimodal time series. Our major object of study lies in the properties and perception of musical orchestration. Orchestration is the subtle art of writing musical pieces for orchestra, combining the spectral properties of each instrument to achieve a particular sonic goal. In this context, the multivariate analysis of temporal processes is required given the inherent multidimensional nature of instrumental mixtures. Time series need to be scrutinized at variable time scales (termed here granularities), as a wealth of time scales co-exist in music (from the identity of single notes up to the structure of entire pieces). Furthermore, orchestration lies at the exact intersection between the symbol (musical writing) and signal (audio recording) representations.
After introducing the general framework and several state-of-the-art creative applications developed in recent years, we will focus on various applications of the variational learning framework to disentangle factors of audio variation. We will detail several recent papers produced by our team, which regularize the topology of the latent space based on perceptual criteria, work with both audio waveforms and spectral transforms, and perform timbre style transfer between instruments. Finally, we discuss the development of these approaches as creative tools for increasing musical creativity in contemporary music, and show case studies of recent pieces played at renowned venues.
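As background for the talk, the variational learning framework mentioned above rests on a standard piece of machinery: an encoder maps an input (for instance an audio frame) to a diagonal Gaussian posterior q(z|x), a latent sample is drawn via the reparameterization trick, and training balances a reconstruction term against a KL regularizer that shapes the latent space. The following is a minimal NumPy sketch of that machinery only; the toy linear "encoder", the shapes, and all names are illustrative assumptions, not the team's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w_mu, w_logvar):
    """Toy linear encoder producing the parameters (mu, log-variance) of q(z|x)."""
    return x @ w_mu, x @ w_logvar

def reparameterize(mu, logvar, rng):
    """Sample z = mu + sigma * eps, keeping the sampling step differentiable."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

# One toy 8-dimensional input frame mapped to a 2-dimensional latent space.
x = rng.standard_normal((1, 8))
w_mu = rng.standard_normal((8, 2)) * 0.1
w_logvar = rng.standard_normal((8, 2)) * 0.1

mu, logvar = encode(x, w_mu, w_logvar)
z = reparameterize(mu, logvar, rng)
kl = kl_to_standard_normal(mu, logvar)

print(z.shape)  # latent sample of shape (1, 2)
```

In a full model, the KL term here would be added to a reconstruction loss (e.g. over a waveform or spectral transform) to form the evidence lower bound; the perceptual regularization discussed in the talk modifies how this latent space is shaped.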
We will then open the discussion to the question of creative intelligence through the analysis of orchestration, and how this could give rise to a whole new category of generic creative learning systems.
Philippe Esling received an M.Sc. in Acoustics, Signal Processing and Computer Science in 2009 and a PhD on multiobjective time series matching in 2012. He was a post-doctoral fellow in the Department of Genetics and Evolution at the University of Geneva in 2012, and has been a tenured associate professor at IRCAM, Paris 6 since 2013. He has authored and co-authored over 20 peer-reviewed papers in prestigious journals such as ACM Computing Surveys, Proceedings of the National Academy of Sciences, IEEE TASLP and Nucleic Acids Research. He received a young researcher award for his work on audio querying in 2011, a PhD award for his work on multiobjective time series data mining in 2013, and several best paper awards. He developed and released Orchids, the first computer-aided orchestration software, commercialized in 2014, which already has a worldwide community of thousands of users and has led to musical pieces by renowned composers played at international venues. He is the lead investigator of machine learning applied to music generation and musical orchestration, and directs the recently created Artificial Creative Intelligence and Data Science (ACIDS) team at IRCAM.
Adrien Bitton is currently working on his PhD at IRCAM under the supervision of Carlos Agon and Philippe Esling in the ACIDS team. He carried out a research internship in the Acoustics and Music Technology group at the University of Edinburgh, followed by a year of research in the Audio Communication group at TU Berlin and an internship at NI. His research focuses on deep learning applications to sound synthesis, interaction, audio effects and transfers.
The meetup will be in the last yard before the river: