This year has been especially busy: among a range of activities, I have participated in five panels, all related to music technology! This blog post summarizes my participation and insights on the second-to-last panel of the year, “Future Of The Music Industries” (November 15, 2018), with Joe Lyske (chair, MXX), Jesper Skibsby (panelist, WARM), Nick Breen (panelist, Reed Smith) and myself (panelist, Norwegian University of Science and Technology (NTNU)), during the Resonate Music Industry Conference 2018, held at Barras Art and Design (BAAD), Glasgow, Scotland, UK. The last panel of the year that I participated in was held on November 21, 2018, at the Sonic Arts Research Centre, and it was about “Women in Music Technology around the World” (full video available online), which relates to this blog post (see “Bringing more diversity”).
Resonate 2018 is a music industry conference that combines keynotes, panels, workshops, and drop-ins related to industry training and business development. It is organized by 23rd Precinct Music, a music publishing company operating since the 1990s. The event both promotes and connects music industry organizations, mainly from Scotland and the rest of the UK. The panels of the conference covered a range of topics, from “Benefiting from the Music Conferences” to “Scotland and Emerging Markets”, “What you Syncing 2.0” and “Future Of The Music Industries”.
Our panel was meant to offer insights into newly emerging technologies that could affect music industry businesses, and ideas on how to embrace them to their benefit. The panel was diverse, representing four lenses on the music industry: artificial intelligence in music production, intelligent broadcasting, digital licensing, and research in new technologies to create music. Dr Joe Lyske, CEO and co-founder of the creative AI technology company MXX Music, chaired the panel and did a great job of raising topics of discussion related to our respective backgrounds and of interest to the audience. It was also a surprise to find out that we had both been at the same two institutions, namely Georgia Tech and Queen Mary University of London (QMUL), but at different times! Jesper Skibsby is the founder of World Airplay Radio Monitor (WARM), a large-scale radio monitoring service focusing on individuals in the music industry. He has broad experience in the music industry and with labels, and is a board member of DUP (the association for independent record labels in Denmark). Nick Breen is a senior associate in the Entertainment and Media Industry Group at the law firm Reed Smith, with expertise in digital media, music, advertising and content distribution, specializing in the digital distribution of content. As an associate professor in music technology at NTNU, I represented the field of research in new interfaces for musical expression, bringing my experience from Universitat Pompeu Fabra, Georgia Tech, QMUL and NTNU.
Overview of past technologies that impacted the industry
Prior to the day of the panel, Joe Lyske led an online discussion that helped us frame the panel’s themes. I found the overview of past technologies that impacted the industry particularly interesting:
- 1990s Winamp by Justin Frankel, which changed the way of experiencing digital music;
- P2P file sharing e.g. Napster, which changed the way of consuming music;
- iTunes, which changed the selling unit from album to song and adapted the traditional business model to digital media;
- streaming / the internet, which brought models of radio online streaming, online network music, online distribution, and automatic playlist generation; and
- mobile apps, which empower users in the way they consume and create music.
A common pattern across this history is the development of new experiences and technologies that become popular (moving from innovators, to early adopters, to the early majority and so on, according to the technology adoption life cycle), to the extent that they achieve a notable social and economic impact.
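As a side note, the segment shares in the technology adoption life cycle mentioned above come from modeling adopters as a normal distribution over time, with segment boundaries at fixed standard deviations from the mean. A minimal sketch (my own illustration, not something discussed in the panel):

```python
from statistics import NormalDist

# Rogers' technology adoption life cycle segments adopters by when they
# adopt, modeled as a normal distribution over time. The segment
# boundaries fall at mean - 2sd, mean - 1sd, the mean, and mean + 1sd.
nd = NormalDist(mu=0.0, sigma=1.0)
segments = {
    "innovators":     nd.cdf(-2),               # ~2.5%
    "early adopters": nd.cdf(-1) - nd.cdf(-2),  # ~13.5%
    "early majority": nd.cdf(0) - nd.cdf(-1),   # ~34%
    "late majority":  nd.cdf(1) - nd.cdf(0),    # ~34%
    "laggards":       1 - nd.cdf(1),            # ~16%
}
for name, share in segments.items():
    print(f"{name}: {share:.1%}")
```

The interesting part for the music industry is the jump from early adopters to the early majority, which is when a technology like streaming starts to have real economic weight.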
Benefits of AI and machine learning for the music industry
On the day of the panel, however, we kicked off by talking directly about present topics, including artificial intelligence (AI), virtual reality, smart speakers, wearable tech, auditory cognitive psychology, and so on. We tackled AI from the perspective of how it can empower musicians, similar to how creative coding platforms such as Processing and openFrameworks empowered artists to produce creative programs. We discussed interactive machine learning (IML), that is, machine learning involving interactions with humans, and the easy-to-use tools already available that allow users to be creative with AI and machine learning (see Rebecca Fiebrink’s Wekinator). This can impact new ways of both producing and performing music.
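To make the idea of IML concrete, here is a minimal sketch in the spirit of Wekinator-style tools (my own toy illustration, not Wekinator’s actual implementation): the user demonstrates a few input-to-output examples, a simple nearest-neighbour mapping is fit on them instantly, and new controller readings are mapped to a synthesis parameter.

```python
import math

# Toy interactive machine learning (IML) mapping: the user demonstrates
# a few (controller input -> synth parameter) examples, and new inputs
# are mapped by averaging the outputs of the k nearest demonstrations.
# Because "retraining" is instant, the user can add or remove examples
# and hear the new mapping immediately -- the core of the IML loop.
demonstrations = [
    ([0.0, 0.0], 200.0),   # hand at rest   -> dark timbre (cutoff in Hz)
    ([0.5, 0.2], 800.0),
    ([1.0, 1.0], 4000.0),  # hand raised up -> bright timbre
]

def predict(x, examples, k=2):
    """Average the outputs of the k demonstrations nearest to x."""
    ranked = sorted(examples, key=lambda ex: math.dist(x, ex[0]))
    return sum(target for _, target in ranked[:k]) / k

# Map a fresh controller reading to a filter cutoff.
print(predict([0.8, 0.7], demonstrations))  # -> 2400.0
```

In a real setup the inputs would come from a sensor or camera and the output would drive a synthesizer parameter in real time; the point is how few examples the user needs to provide to get a playable mapping.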
We also discussed songwriting with AI and potential licensing issues (see “Future of digital licensing and distribution”). There is a tradition in computer music of creating algorithmic compositions using AI methods, e.g. David Cope and his EMI (Experiments in Musical Intelligence), with albums produced at the push of a button, and more recently the Flow Machines project led by François Pachet, for example with the production of the album Hello World by SKYGGE. However, we reckoned that there is still too much manual labor involved for this to be considered truly AI songwriting.
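The flavor of this algorithmic-composition tradition can be illustrated with a tiny first-order Markov chain over notes, a classic technique in the field; this is my own toy example, not anything resembling EMI or Flow Machines:

```python
import random

# Toy first-order Markov chain melody generator: learn note-to-note
# transition options from a short "corpus", then sample a new melody
# by walking those transitions.
corpus = ["C", "D", "E", "C", "E", "G", "E", "D", "C", "D", "E", "G"]

transitions = {}
for current, following in zip(corpus, corpus[1:]):
    transitions.setdefault(current, []).append(following)

def generate(start, length, seed=0):
    """Sample a melody by repeatedly choosing a learned next note."""
    rng = random.Random(seed)  # fixed seed -> reproducible melody
    melody = [start]
    for _ in range(length - 1):
        melody.append(rng.choice(transitions[melody[-1]]))
    return melody

print(generate("C", 8))
```

Every generated transition is one that actually occurred in the corpus, which is also why the "manual labor" point from the panel stands: someone still has to curate the corpus, pick the model, and select among the outputs.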
The disappearing boundaries between listener and creator
We talked about participatory mobile music and how, mediated by mobile phones, the roles of listener and creator are becoming more interchangeable. The Web Audio Conference (WAC) is an annual conference that showcases the latest in web audio technologies through talks, demos and concerts, and it is the perfect venue for getting exposed to the potential of this technology. It is noteworthy that NTNU will host the WAC 2019 conference on December 4–6, 2019! Similarly, we discussed the potential of collaboration and how new technologies support new ways of collaborating in music production and performance. An example is our Music, Communication and Technology master’s program (NTNU–UiO), which has two replicas of a physical space, “The Portal”, where real-time audiovisual communication is possible at very low latency; it is therefore a test-bed space for exploring remote and co-located collaboration applied to both music production and music performance.
Future of digital licensing and distribution
When talking about AI, there seemed to be more questions than answers in relation to future licensing and distribution. For example:
- Who holds authorship when music is created with AI? Who is the author of a track created using AI methods? Can an AI agent be the legal creator of a track? Can it be brought to court for copyright infringement?
- Should AI engines be considered “fake artists”? Where are the boundaries between real artists and fake/ghost artists? What prompts the question in the first place: the final artistic product or the methods used?
- Can/should a song be licensed for AI purposes so that it can become part of a “corpus” that informs the next round of songs? What are the benefits for the original artist? This resembles the advent of hip hop sampling in the 1980s and the need back then to revise licensing and distribution regulations to accommodate new ways of creating music.
We also discussed the Music Modernization Act and other laws that aim to modernize copyright for music and audio recordings in light of new forms of technology like digital streaming. It was interesting to bring the EU-funded project Audio Commons to the forefront, which looks into providing an ecosystem of technologies, services and users around crowdsourced Creative Commons (CC) licensed audio material. This ecosystem is designed to bring CC-licensed audio into creative workflows in audiovisual production, e.g. digital audio workstations. We also commented on the potential of blockchains as a platform that offers a new form of distribution with alternative payment models (e.g. Imogen Heap’s Mycelia). However, blockchains still do not scale well: they are both computationally and energetically very demanding.
Bringing more diversity
Finally, we talked about diversity as a beneficial means of bringing new technologies into music creation. Interestingly, diversity was also discussed in the “Opening Keynote with Maggie Crowe” and the previous panel, “Benefitting from Music Conferences”. After our panel, there was a presentation by Neil Patterson from the organisation Drake Music Scotland, which provides music-making opportunities for people with disabilities. Neil suggested that accessible stages, in the first place, would make a difference in welcoming disabled performers onstage.
Altogether, this indicates that it is a positive moment in music technology to take action on bringing more diversity, both in research and industry, as I already discussed in my blog post Equality, Diversity, Gender in Music Technology. However, in the panel “Benefitting from Music Conferences” the point was raised that women should more often accept invitations to participate in panels about music technology; otherwise it is difficult to make the snowball roll faster.
During this event I met a number of people from the music industry in Scotland and the UK and realized how easy it is to live in our own bubbles. Beyond the differences between industry and research goals, there can always be paths of collaboration and partnership if both sides are interested. My experience with the EU-funded project Audio Commons has shown that collaboration between academia (Universitat Pompeu Fabra, Queen Mary University of London, University of Surrey) and industry partners (Waves, Jamendo, AudioGaming) can be mutually beneficial. Research and innovation programmes such as the EU’s Horizon 2020 are the perfect spaces to start initiatives of this nature.
Acknowledgments: Thank you to all the team from 23rd Precinct Music, who were welcoming and made me feel at home in Glasgow! Special thanks to Kayleigh McLaughlan and Susan Montgomery for their invitation, help and support.
Note: The ideas of this text were presented in Panel: Future Of The Music Industries with Joe Lyske (chair, MXX), Jesper Skibsby (panelist, WARM), Nick Breen (panelist, Reed Smith) and Anna Xambó (panelist, NTNU). Resonate Music Conference 2018. Barras Art and Design (BAAD), Glasgow, Scotland, UK.