Music and artificial intelligence

Research in artificial intelligence (AI) is known to have impacted medical diagnosis, stock trading, robot control, and several other fields. Less widely known is the contribution of AI to the field of music. Nevertheless, artificial intelligence and music (AIM) has long been a common subject at conferences and workshops, including the International Computer Music Conference, the Computing Society Conference and the International Joint Conference on Artificial Intelligence; the first International Computer Music Conference (ICMC) was held in 1974 at Michigan State University in East Lansing, USA. Current research includes the application of AI in music composition, performance, theory and digital sound processing. Several music software applications have been developed that use AI to produce music. A few examples are described below; many more are still in development.

History

In 1960, Russian researcher R. Kh. Zaripov published the world's first paper on algorithmic music composition, using the "Ural-1" computer.

In 1965, inventor Ray Kurzweil premiered a piano piece created by a computer capable of recognizing patterns in various compositions; the computer analyzed these patterns and used them to create novel melodies. The computer debuted on Steve Allen's I've Got a Secret program and stumped the hosts until film star Henry Morgan guessed Ray's secret.

Orb Composer

Orb Composer is a program developed by Hexachords under the direction of Richard Portelli, focused mainly on orchestral music.

EMI

EMI is a program developed by David Cope that composes classical music; see Experiments in Musical Intelligence. Emily Howell is an interactive augmentation of EMI. (As a popular example, the background music of the viral video Humans Need Not Apply was created by "her", as the video reveals in order to illustrate the likely fate of creative jobs.)

OrchExtra

This program was designed to give small-budget productions the instrumentation of a full orchestra: when a small ensemble is playing, it supplies the parts of the missing instruments, so high school and community theaters wanting to produce a musical can realize a full Broadway score with the virtual orchestra. The software follows fluctuations in tempo and musical expression. Musicians enjoy the thrill of playing with a full orchestra, while the audience enjoys the rich sound that comes from combining the virtual orchestra with the live players.

Computer Accompaniment (Carnegie Mellon University)

The Computer Music Project at CMU develops computer music and interactive performance technology to enhance human musical experience and creativity. This interdisciplinary effort draws on Music Theory, Cognitive Science, Artificial Intelligence and Machine Learning, Human Computer Interaction, Real-Time Systems, Computer Graphics and Animation, Multimedia, Programming Languages, and Signal Processing. One of their projects is similar to SmartMusic: it provides an accompaniment for the chosen piece that follows the soloist (the user) despite tempo changes and mistakes.
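
As a rough illustration of what such accompaniment involves, the sketch below tracks a soloist's position in a known score and re-estimates the tempo so an accompaniment part could stay in step. It is a minimal sketch only, not CMU's published algorithm; the note representation, matching window and tempo-smoothing factor are assumptions made for the example.

```python
# Toy score follower: match incoming soloist notes to the expected score order
# and re-estimate the tempo, so an accompaniment could reschedule its own notes.

from dataclasses import dataclass

@dataclass
class ScoreNote:
    pitch: int    # MIDI pitch number
    beat: float   # position in the score, in beats

class ToyScoreFollower:
    def __init__(self, score, initial_bpm=120.0):
        self.score = score        # list of ScoreNote, in score order
        self.index = 0            # next expected score note
        self.bpm = initial_bpm    # current tempo estimate
        self.last_time = None     # wall-clock time of the last matched note
        self.last_beat = None     # score position of the last matched note

    def on_note(self, pitch, time_sec, window=3):
        """Match a played note to the score and update the tempo estimate."""
        # Look a few notes ahead so a wrong or skipped note does not derail us.
        for k in range(self.index, min(self.index + window, len(self.score))):
            if self.score[k].pitch == pitch:
                if self.last_time is not None:
                    beats = self.score[k].beat - self.last_beat
                    seconds = time_sec - self.last_time
                    if beats > 0 and seconds > 0:
                        # Blend the new estimate with the old one for stability.
                        self.bpm = 0.5 * self.bpm + 0.5 * (60.0 * beats / seconds)
                self.last_time, self.last_beat = time_sec, self.score[k].beat
                self.index = k + 1
                break
        return self.bpm           # the accompaniment would reschedule to this tempo

# Example: a three-note score played slightly faster than the notated 120 BPM.
follower = ToyScoreFollower([ScoreNote(60, 0.0), ScoreNote(62, 1.0), ScoreNote(64, 2.0)])
follower.on_note(60, 0.0)
print(follower.on_note(62, 0.45))   # tempo estimate rises above 120
```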

SmartMusic

SmartMusic is an interactive, computer-based practice tool for musicians. It offers exercises, instant feedback tools, and accompaniments meant to aid musicians. The product is targeted at teachers and students alike and offers five categories of accompaniments: solo, skill development, method books, jazz, and ensemble. Teachers can give students pre-defined assignments via email and scan in sheet music that is not yet in the SmartMusic catalog. Students can choose the difficulty level they want to play at, slow down or speed up the tempo, or change the key in which to play the piece. SmartMusic also compares students' playing with a digital template, which allows it to detect mistakes and mark them on a score. It also simulates the rapport between musicians by sensing and reacting to tempo changes.
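
A toy illustration of template-based assessment, not SmartMusic's actual engine, might compare the detected notes of a performance against the score template and label each expected note; the timing tolerance below is an arbitrary assumption.

```python
# Toy performance assessment: compare detected notes against a digital template
# of the piece and label each expected note, the kind of result that could be
# marked on a score.

def assess_performance(template, played, time_tolerance=0.25):
    """
    template: list of (onset_in_beats, midi_pitch) expected by the score
    played:   list of (onset_in_beats, midi_pitch) detected in the performance
    Returns a list of (expected_note, verdict) pairs.
    """
    results = []
    remaining = list(played)
    for onset, pitch in template:
        # Consider only played notes close enough in time to the expected onset.
        nearby = [note for note in remaining if abs(note[0] - onset) <= time_tolerance]
        match = next((note for note in nearby if note[1] == pitch), None)
        if match:
            results.append(((onset, pitch), "correct"))
            remaining.remove(match)
        elif nearby:
            results.append(((onset, pitch), "wrong pitch"))
        else:
            results.append(((onset, pitch), "missed"))
    return results

# Example: the second expected note was played a semitone too high.
print(assess_performance([(0.0, 60), (1.0, 62)], [(0.02, 60), (1.05, 63)]))
```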

StarPlayIt

StarPlay is music education software that lets users practice by performing with professional musicians, bands and orchestras. Users can choose a seat within the ensemble, watch the video from that vantage point, and hear the other musicians around them. The program also listens to the user's performance and helps them improve by providing constructive feedback as they rehearse. StarPlay was developed by StarPlayIt (formerly In The Chair), a music technology company that has won many awards for its platforms for online musical performance and participation.

ChucK

Developed at Princeton University by Ge Wang and Perry Cook, ChucK is a text-based, cross-platform language that allows real-time synthesis, composition, performance and analysis of music. It is used by SLOrk (the Stanford Laptop Orchestra) and PLOrk (the Princeton Laptop Orchestra).

Impromptu

The Impromptu media programming environment was developed by Andrew Sorensen for exploring 'intelligent' interactive music and visual systems. Impromptu is used for live coding performances and research including generative orchestral music and computational models of music perception.

REAPER's TabEditor

Converting MIDI to tablature for a string instrument (guitar, violin, dombra, etc.) is a nontrivial task, since the same note can be played on different strings, and devising a good fingering is sometimes a challenge even for human musicians, especially when translating a two-handed piano composition to a string instrument. TabEditor, a small plugin for the REAPER DAW, uses an AI that solves this puzzle much as a musician would: it keeps the notes of each chord close enough together to be physically playable while fitting as many of the piano notes as possible onto the instrument at once. When a direct translation is impossible (the piano part has more notes than the guitar can play), the AI looks for an acceptable solution that removes as few notes as possible from the original composition. The Prolog programming language was used to implement this AI.
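
A minimal sketch of this idea is shown below, written in Python for brevity (the plugin itself, as noted above, uses Prolog); the guitar tuning, fret range and hand-stretch limit are assumptions made for illustration.

```python
# Toy fingering solver: for each chord, enumerate string/fret assignments,
# prefer compact voicings playable by one hand, and drop as few notes as
# possible when a full voicing will not fit on the instrument.

from itertools import combinations, permutations

OPEN_STRINGS = [40, 45, 50, 55, 59, 64]   # assumed standard guitar tuning, low E to high E
MAX_FRET = 19                             # assumed fret range
MAX_STRETCH = 4                           # widest fret span assumed playable by one hand

def best_voicing(pitches):
    """Return (assignment, dropped) where assignment is a list of (pitch, string, fret)."""
    n_strings = len(OPEN_STRINGS)
    # Keep as many of the original notes as possible, dropping notes only when forced.
    for keep in range(min(len(pitches), n_strings), 0, -1):
        best = None
        for subset in combinations(pitches, keep):
            for strings in permutations(range(n_strings), keep):
                frets = [p - OPEN_STRINGS[s] for p, s in zip(subset, strings)]
                if any(f < 0 or f > MAX_FRET for f in frets):
                    continue                      # note not reachable on that string
                fretted = [f for f in frets if f > 0]
                span = max(fretted) - min(fretted) if fretted else 0
                if span > MAX_STRETCH:
                    continue                      # hand cannot cover the stretch
                # Prefer compact, low positions, roughly as a player would.
                score = (span, sum(fretted))
                candidate = (score, list(zip(subset, strings, frets)))
                if best is None or candidate[0] < best[0]:
                    best = candidate
        if best is not None:
            return best[1], len(pitches) - keep
    return [], len(pitches)

# Example: a C major triad (C4, E4, G4) mapped onto guitar strings.
print(best_voicing([60, 64, 67]))
```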

Ludwig

Ludwig is automated composition software based on tree-search algorithms. It generates melodies according to principles of classical music theory and arranges them with pop accompaniment patterns or in four-part choral writing. Ludwig can react in real time to an eight-bar theme played on a keyboard: while the theme is being performed it is analysed for key, harmonic content and rhythm, and the program then immediately repeats the theme arranged, for example, for orchestra. It subsequently varies the melody to create a short piece as an interactive answer to the human input.
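
As a minimal sketch of melody generation by tree search (not Ludwig's actual engine), the following exhaustively searches short continuations of a starting note and scores them with a few textbook rules; the scale, rule weights and melody length are assumptions chosen for the example.

```python
# Toy melody generator: exhaustive tree search over scale degrees, scoring each
# complete melody with a few textbook rules (prefer stepwise motion, avoid large
# leaps, end on the tonic) and keeping the best-scoring result.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]   # one octave of C major, as MIDI pitches

def rule_score(melody):
    score = 0
    for a, b in zip(melody, melody[1:]):
        interval = abs(b - a)
        if interval == 0:
            score -= 1        # discourage immediate repetition
        elif interval <= 2:
            score += 2        # reward stepwise motion
        elif interval > 7:
            score -= 3        # penalise leaps larger than a fifth
    if melody[-1] % 12 == 0:
        score += 5            # reward ending on the tonic (any C)
    return score

def search(melody, length, best=None):
    """Depth-first tree search for the highest-scoring melody of the given length."""
    if len(melody) == length:
        candidate = (rule_score(melody), list(melody))
        return candidate if best is None else max(best, candidate)
    for pitch in C_MAJOR:
        melody.append(pitch)
        best = search(melody, length, best)
        melody.pop()
    return best

# Example: the best five-note melody starting from middle C under these rules.
score, melody = search([60], 5)
print(score, melody)
```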

OMax

OMax is a software environment that learns, in real time, typical features of a musician's style and plays along interactively, giving the flavor of a machine co-improvisation. OMax uses OpenMusic and Max. It is based on research on stylistic modeling carried out by Gérard Assayag and Shlomo Dubnov, and on research on improvisation with the computer by G. Assayag, M. Chemillier and G. Bloch (a.k.a. the OMax Brothers) in the Ircam Music Representations group.
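
A much-simplified sketch of this kind of co-improvisation is given below, using a plain Markov model over pitches rather than OMax's actual factor-oracle-based machinery; the model order and the example phrase are arbitrary choices.

```python
# Toy style model for machine co-improvisation: learn, from the notes a musician
# has just played, which pitches tend to follow each short context, then generate
# a new phrase by recombining those continuations.

import random
from collections import defaultdict

class StyleModel:
    def __init__(self, order=2):
        self.order = order
        self.continuations = defaultdict(list)   # context tuple -> possible next pitches

    def learn(self, pitches):
        """Incrementally learn continuations from a stream of played pitches."""
        for i in range(len(pitches) - self.order):
            context = tuple(pitches[i:i + self.order])
            self.continuations[context].append(pitches[i + self.order])

    def improvise(self, seed, length=16):
        """Generate a phrase that recombines the learned material in a similar style."""
        phrase = list(seed[-self.order:])
        for _ in range(length):
            context = tuple(phrase[-self.order:])
            options = self.continuations.get(context)
            if not options:                       # dead end: jump to a known context
                context = random.choice(list(self.continuations))
                phrase.extend(context)
                options = self.continuations[context]
            phrase.append(random.choice(options))
        return phrase

# Example: learn from a short played phrase, then answer with a recombination.
model = StyleModel(order=2)
model.learn([60, 62, 64, 62, 60, 64, 65, 67, 65, 64, 62, 60])
print(model.improvise(seed=[60, 62]))
```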

Melomics

Melomics is a proprietary computational system for the automatic composition of music (without human intervention), based on bioinspired methods and produced by Melomics Media. It composes in a wide variety of genres, and all music composed by Melomics algorithms is available in MP3, MIDI, MusicXML and PDF (sheet music) formats after purchase. Music composed by this system was collected on the album Iamus, which New Scientist hailed as "the first complete album to be composed solely by a computer and recorded by human musicians."

Flow Machines

Flow Machines is a research project funded by the European Research Council (ERC) and led by François Pachet. It aims to turn musical style into a computational object that can be applied to AI-generated melodies and harmonies. Flow Machines has composed two fully fledged pop songs, Daddy's Car and Mister Shadow, in collaboration between the AI software and pop composer Benoît Carré. Flow Machines also produced DeepBach, a neural network system that produces harmonisations in the style of Bach indiscernible from Bach's original harmonisations.
