15-323 / 15-623
Computer Music Systems and Information Processing

Week 1

NOTE: Expect slides to be updated throughout the semester, usually around the time they are presented.

Jan 15 (Tues)

Slides

Course Overview

Syllabus

The Big Picture

MIDI Introduction

Concepts

What the course is about

What is required of students

Project work

MIDI represents music performance actions

Notes, note-on, note-off

Timing in MIDI
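
To make the note-on/note-off idea above concrete, here is a minimal byte-level sketch. It is written in Python for illustration (not Serpent, and not course code); the 0x90/0x80 status values, the channel nibble, and the 7-bit data bytes come from the MIDI standard.

    # Raw MIDI channel messages are three bytes: a status byte whose high nibble is
    # the message type and low nibble is the channel (0-15), followed by two 7-bit
    # data bytes (pitch and velocity, each 0-127).

    def note_on(channel, pitch, velocity):
        return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

    def note_off(channel, pitch, velocity=0):
        return bytes([0x80 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

    # Middle C (pitch 60) on channel 0: send note_on when the note starts and
    # note_off (or a note_on with velocity 0) when it ends.
    print(note_on(0, 60, 100).hex())   # 903c64
    print(note_off(0, 60).hex())       # 803c00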

Jan 17 (Thur)

MIDI Standard

Serpent

Serpent MIDI API

Reading

MIDI Standard: David’s MIDI Spec

General MIDI Programs

See also MIDI Tutorial for Programmers

Serpent Style Guide

Concepts

Channels in MIDI

Control change and what control changes affect

How to use Serpent to send MIDI
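
Similarly, the control-change concept above boils down to a three-byte message with status 0xB0. A minimal Python sketch for illustration (not Serpent, not course code); controller 7 (channel volume) and 64 (sustain pedal) are standard controller assignments.

    # Control change: status 0xB0 | channel, then controller number and value (0-127).
    def control_change(channel, controller, value):
        return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

    print(control_change(0, 7, 100).hex())    # set channel 0 volume to 100 -> b00764
    print(control_change(0, 64, 127).hex())   # sustain pedal down -> b0407f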

  ◀ Survey (due Jan 16)

  ◀ Install and Test Serpent (nothing to hand in, but you’ll need this soon)

  ◀ Homework 1 (due Jan 22)

  ◀ Project 1: Output MIDI data (due Jan 29)

  ▶ Survey (due Jan 16)

Week 2

Jan 22 (Tues)

Slides

Discrete Event Simulation

Reading

Introduction to Discrete Event Simulation

Concepts

What is Discrete Event Simulation?

“Events” as timed procedure calls / method invocations

Logical time vs. real time

Scheduling and dispatching as primitives

Creation of timed behaviors using scheduled events

Waiting for enabling conditions: polling and signalling
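
The concepts above can be sketched in a few lines: events are just (time, procedure, arguments) records kept in a priority queue, and dispatching pops the earliest one and calls it. This toy Python version (not the Serpent scheduler used in class) uses heapq, which keeps insertion and removal at O(log n); note that logical time jumps from event to event rather than tracking real time.

    import heapq

    class Simulator:
        def __init__(self):
            self.queue = []    # pending events: (time, seq, function, args)
            self.time = 0      # current logical time
            self.count = 0     # tie-breaker so equal-time events run in FIFO order

        def schedule(self, delay, fn, *args):
            heapq.heappush(self.queue, (self.time + delay, self.count, fn, args))
            self.count += 1

        def run(self):
            while self.queue:
                self.time, _, fn, args = heapq.heappop(self.queue)
                fn(*args)      # dispatch: a timed procedure call

    sim = Simulator()

    def note(pitch):
        print("logical time", sim.time, "play pitch", pitch)
        if pitch < 64:
            sim.schedule(1, note, pitch + 2)   # create a timed behavior by rescheduling

    sim.schedule(0, note, 60)
    sim.run()    # dispatches events at logical times 0, 1, 2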

  ▶ Homework 1 (due Jan 22)

Jan 24 (Thur)

Events in Serpent

Scheduling algorithms

Active Objects

Concepts

Representing events in Serpent

Fast O(log n) scheduling and dispatch algorithms

Object protocols for scheduling and timing

Week 3

Jan 29 (Tues)

Slides

Virtual Time and Tempo

Logical (or Virtual) Time Systems

Tempo and Time Maps

Formula Programming Language

Multiple and Nested Logical Time Systems

Precise Timing

Reading

Anderson, D. P. and Kuivila, R. 1990. A system for computer music performance. ACM Trans. Comput. Syst. 8, 1 (Feb. 1990), 56-82.

Concepts

logical time as a specification of desired behavior

tempo as slope of time map

beats as integral of tempo

time as integral of 1/tempo

tempo and control parameters through computation in Formula

nested and multiple tempo
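
As a small worked example of the time-map idea above (tempo is the slope of the beat-versus-time map, so time is the integral of 1/tempo over beats), here is an illustrative Python sketch for a piecewise-constant tempo; it is not Formula or course code.

    # A time map as a list of (start_beat, bpm) segments.  Within a segment,
    # delta_time = delta_beats * 60 / bpm, i.e. we integrate 1/tempo over beats.
    segments = [(0, 120), (8, 60), (16, 120)]   # beats 0-8 at 120 bpm, 8-16 at 60 bpm, ...

    def beat_to_time(beat):
        time = 0.0
        for i, (start_beat, bpm) in enumerate(segments):
            end_beat = segments[i + 1][0] if i + 1 < len(segments) else float("inf")
            span = min(beat, end_beat) - start_beat
            if span <= 0:
                break
            time += span * 60.0 / bpm
        return time

    print(beat_to_time(8))    # 4.0 seconds: 8 beats at 120 bpm
    print(beat_to_time(12))   # 8.0 seconds: 4 more beats at the slower 60 bpm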

  ▶ Project 1: Output MIDI data (due Jan 29)

  ◀ Homework 2 (due Feb 14)

  ◀ Project 2: Music application using a scheduler and graphical interface (due Feb 14), p2canvas.srp sample code

Jan 31 (Thur)

Event buffers

Latency

Timed Messages

Concepts

forward synchronous systems

event buffers

active objects and their implementation

how do event buffers reduce jitter at the cost of latency?

examples of event buffering in applications, device drivers, hardware.
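
To illustrate the jitter/latency trade-off above: a forward-synchronous system timestamps each event a fixed latency into the future, and the output stage emits it exactly at that timestamp, so variation in when the computation happens to run no longer appears in the output. A rough Python sketch, for illustration only:

    import time, heapq

    LATENCY = 0.05   # 50 ms of added latency tolerates up to 50 ms of computation jitter

    buffer = []      # the event buffer: (output_time, message), ordered by output time

    def send(message):
        # Producer: stamp the event LATENCY seconds ahead of "now" and buffer it.
        heapq.heappush(buffer, (time.monotonic() + LATENCY, message))

    def output_loop(duration):
        # Consumer (think device driver or output thread): emit at the timestamp.
        deadline = time.monotonic() + duration
        while time.monotonic() < deadline:
            if buffer and buffer[0][0] <= time.monotonic():
                t, message = heapq.heappop(buffer)
                print(message, "emitted", round(time.monotonic() - t, 4), "s after its timestamp")
            else:
                time.sleep(0.001)

    send("note-on 60")
    send("note-on 64")
    output_loop(0.2)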

Week 4

Feb 5 (Tues)

Scheduling from Theory to Practice

(no slides, see reading)

optional: video lecture

Reading

Scheduling and Graphical Interfaces (this is now incorporated into Serpent documentation here: Introduction to Scheduling: Scheduling and MIDI I/O Using Serpent libraries)

Code from “Scheduling and Graphical Interfaces” (also available here in Serpent documentation)

Concepts

Real-time scheduling in Serpent

"Threads" via temporal recursion

Stopping a Thread Pattern

Using virtual time scheduler

Graphical Interface Building

Running schedulers under wxSerpent64

Forward Scheduling with PortMidi
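
The "threads via temporal recursion" pattern above can be imitated with Python's standard sched module (this only illustrates the idiom; it is not the Serpent/wxSerpent API): a function does a little work, schedules itself to run again later, and stops by checking a condition before rescheduling.

    import sched, time

    scheduler = sched.scheduler(time.monotonic, time.sleep)
    running = True   # the "stop" flag checked by the temporally recursive function

    def metronome(beat):
        print("tick", beat)
        if running and beat < 4:    # stopping-a-thread pattern: just stop rescheduling
            scheduler.enter(0.5, 1, metronome, (beat + 1,))

    scheduler.enter(0, 1, metronome, (0,))
    scheduler.run()   # ticks at 0.5 s intervals, then the "thread" ends naturally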

Feb 7 (Thur)

Music Theory

(no slides, see reading)

Reading

An Introduction to Music Concepts

Concepts

Rhythm: beats/quarter notes, meter

Pitch: octaves, scales, intervals

Harmony: triads, root, inversions, voicing

Form: chord sequences, repeat, rondo, song form
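
One formula that sits behind the pitch items above: in equal temperament, MIDI pitch p corresponds to 440 * 2^((p - 69) / 12) Hz, and a scale is just a fixed pattern of semitone intervals. A quick Python check (illustrative only):

    # MIDI pitch 69 is A4 = 440 Hz; each semitone multiplies frequency by 2**(1/12),
    # so an octave (12 semitones) doubles it.
    def midi_to_hz(pitch):
        return 440.0 * 2 ** ((pitch - 69) / 12)

    MAJOR = [0, 2, 4, 5, 7, 9, 11, 12]           # major-scale intervals in semitones
    c_major = [60 + i for i in MAJOR]            # C major starting at middle C
    print([round(midi_to_hz(p), 1) for p in c_major])
    # [261.6, 293.7, 329.6, 349.2, 392.0, 440.0, 493.9, 523.3]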

Week 5

Feb 12 (Tues)

Slides

Algorithmic Music Composition I
Markov models
Pitch sequences
Pitch + Rhythm
Concurrencies
Music grammars

Concepts

music as time series data

Markov models

estimating transition probabilities

concurrency (Birmingham and Pardo)

music as formal language

hierarchical structure and its relationship to grammars

context sensitive grammars
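
A toy version of the Markov ideas above, written in Python for illustration (not course code): estimate first-order transition probabilities by counting adjacent pairs in a training melody, then sample a new pitch sequence from the estimated model.

    import random
    from collections import defaultdict, Counter

    def train(sequence):
        # Count transitions, then normalize each row to probabilities.
        counts = defaultdict(Counter)
        for a, b in zip(sequence, sequence[1:]):
            counts[a][b] += 1
        return {a: {b: n / sum(c.values()) for b, n in c.items()}
                for a, c in counts.items()}

    def generate(model, start, length):
        out = [start]
        while len(out) < length and out[-1] in model:
            nexts = model[out[-1]]
            out.append(random.choices(list(nexts), weights=list(nexts.values()))[0])
        return out

    melody = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60]
    model = train(melody)
    print(model[62])                 # estimated P(next pitch | current pitch 62)
    print(generate(model, 60, 12))   # a new sequence sampled from the model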

Feb 14 (Thur)

Algorithmic Music Composition II
Pattern Generators
Suffix trees
Structure
Popular Music Generation

Reading

Elowsson, A. and Friberg, A. “Algorithmic Composition of Popular Music,” in Proceedings of the 12th International Conference on Music Perception and Cognition and the 8th Triennial Conference of the European Society for the Cognitive Sciences of Music, July 2012, pp. 276-285.

Kohonen, T. “A self-learning musical grammar, or ‘associative memory of the second kind’,” in IJCNN, International Joint Conference on Neural Networks (June 1989), vol. 1, pp. 1-5.

Concepts

use of pattern generators in music

unsupervised sequence learning

role of structure in music

  ▶ Homework 2 (due Feb 14)

  ▶ Project 2: Music application using a scheduler and graphical interface (due Feb 14), p2canvas.srp sample code

  ◀ Homework 3 (due Feb 21)

  ◀ Project 3: Adding OSC control (due Feb 28), extras: p3.zip

Week 6

Feb 19 (Tues)

Slides

Open Sound Control and O2

Reading

Matthew Wright, Adrian Freed, Ali Momeni, “OpenSound Control: State of the Art 2003,” in Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada, 2003, pp. 153-159.

Roger B. Dannenberg and Zhang Chi. “O2: Rethinking Open Sound Control,” in Proceedings of the 42nd International Computer Music Conference, Utrecht: HKU University of the Arts Utrecht, 2016, pp. 493-496.

Concepts

OSC & O2 addressing mechanisms

OSC & O2 timing mechanisms

OSC & O2 network addressing

OSC & O2 reply, status, acknowledgements (or lack thereof)
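
For a feel of OSC's URL-like, hierarchical addressing, here is a minimal client sketch using the third-party python-osc package (an assumption: pip install python-osc; this is not O2, Serpent, or course code, and the address names are made up for the example):

    from pythonosc.udp_client import SimpleUDPClient

    # OSC messages pair an address pattern with typed arguments; the receiver
    # decides what each address means.  Transport here is plain UDP (best effort).
    client = SimpleUDPClient("127.0.0.1", 8000)     # host and port of some OSC receiver
    client.send_message("/synth/freq", 440.0)       # one float argument
    client.send_message("/synth/note", [60, 100])   # two int arguments: pitch, velocity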

Feb 21 (Thur)

Clock Synchronization

Concepts

clock drift

clock skew

simple clock sync protocol
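
The simple clock-sync protocol above can be illustrated in a few lines of Python: ask the reference clock for its time, measure the round trip, and assume the reply was generated halfway through, so offset = remote_time + rtt/2 - local_receive_time. The "network" and the skewed remote clock below are simulated locally; this is a sketch, not a real protocol implementation.

    import time, random

    TRUE_OFFSET = 2.5   # pretend the remote clock runs 2.5 s ahead of ours

    def remote_time():
        time.sleep(random.uniform(0.005, 0.020))   # request travels over the network
        t = time.monotonic() + TRUE_OFFSET         # remote clock is read here
        time.sleep(random.uniform(0.005, 0.020))   # reply travels back
        return t

    samples = []
    for _ in range(10):
        t_send = time.monotonic()
        t_remote = remote_time()
        t_recv = time.monotonic()
        rtt = t_recv - t_send
        samples.append((rtt, t_remote + rtt / 2 - t_recv))   # offset estimate

    # The sample with the smallest round trip usually gives the best estimate.
    print("estimated offset:", round(min(samples)[1], 4), "s; true offset:", TRUE_OFFSET)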

  ▶ Homework 3 (due Feb 21)

Week 7

Feb 26 (Tues)

Slides

Laptop Orchestras

Music Networks

Sequencers and MIDI Files

Reading

Jorg Stelkens. “peerSynth: A P2P Multi-User Software Synthesizer with new techniques for integrating latency in real time collaboration,” in ICMC 2003 Proceedings, 2003.

Standard MIDI Files: Standard MIDI File Specification (This PDF was generated from a web page that is no longer online. It contains the full MIDI spec, including listings of controllers and General MIDI instruments, as well as information on Standard MIDI Files.)

Concepts

Types and examples of laptop orchestras

Example compositions for laptop orchestras

network latency and musical implications

packets

best effort

detecting lost packets

retransmission

acknowledgement

Standard MIDI File format
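
One concrete detail of the Standard MIDI File format listed above: every event's delta time is stored as a variable-length quantity, seven data bits per byte with the high bit set on every byte except the last. A small decoding sketch in Python (illustrative only):

    def read_vlq(data, pos=0):
        # Accumulate 7 bits per byte, most significant group first,
        # until a byte with the high bit clear ends the quantity.
        value = 0
        while True:
            byte = data[pos]
            pos += 1
            value = (value << 7) | (byte & 0x7F)
            if not (byte & 0x80):
                return value, pos

    print(read_vlq(bytes([0x00])))              # (0, 1)
    print(read_vlq(bytes([0x7F])))              # (127, 1)
    print(read_vlq(bytes([0x81, 0x48])))        # (200, 2)
    print(read_vlq(bytes([0xFF, 0xFF, 0x7F])))  # (2097151, 3)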

Feb 28 (Thur)

Max Family of Programming Languages for Music

Soundcool

Links

cycling74.com

maxobjects.com

puredata.info

Reading

Watch the video on this page: MAX MSP Overview

Sections 2.1 through 2.8 in Pd Documentation Chapter 2 Theory of Operation

Soundcool Manual

Concepts

Visual Programming

Outlets and Inlets

Static “Data Flow” Graph

Order of Execution

Synchronous Audio Graphs

Using Max/Pd to create sound

  ▶ Project 3: Adding OSC control (due Feb 28), extras: p3.zip

Week 8

Mar 5 (Tues)

Midterm Practice Exam

MIDTERM

  ◀ Project 4: Audio effect (due Apr 2)

Mar 7 (Thurs)

Music Robots

(guest speaker: TBA)

MIDSEMESTER BREAK      Mar 8 (Fri) through Mar 17 (Sun)

Week 9

Mar 19 (Tues)

Slides

Audio Concepts: Samples, Frames, Blocks

Synchronous Processing

Audio System Organization

Reading

Nicola Bernardini and Davide Rocchesso. “Making Sounds with Numbers: A tutorial on music software dedicated to digital audio.” In Proceedings of COST G-6 DAFX, 1998.

Ross Bencina and Phil Burk, “PortAudio - an Open Source Cross Platform Audio API.” In Proceedings of the 2001 International Computer Music Conference, International Computer Music Association, 2001.

Optional Reading

Steinberg VST Programming Examples

Concepts

samples, frames, blocks

how is audio signal processing computation organized?

why must audio processing be synchronous?

what does synchronous processing mean?

callback/asynchronous api

blocking/synchronous api

typical scheduling strategies for audio applications

what is the architecture of plug-ins?

how do applications use plug-ins?
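
To pin down the vocabulary above: a sample is one number, a frame is one sample per channel, and a block (buffer) is a fixed number of frames handed to the callback at once. The Python/NumPy sketch below mimics what an audio callback does without opening a real device (NumPy assumed installed; this is not PortAudio or VST code):

    import numpy as np

    SAMPLE_RATE = 44100
    BLOCK_FRAMES = 256
    CHANNELS = 2

    start_time = 0.0   # where the next block begins, in seconds

    def callback(frames):
        # Fill one block with a 440 Hz test tone on both channels.
        global start_time
        t = start_time + np.arange(frames) / SAMPLE_RATE
        start_time += frames / SAMPLE_RATE
        mono = 0.2 * np.sin(2 * np.pi * 440.0 * t)
        return np.column_stack([mono] * CHANNELS).astype(np.float32)

    block = callback(BLOCK_FRAMES)
    print(block.shape)                        # (256, 2): frames x channels
    print(1000 * BLOCK_FRAMES / SAMPLE_RATE)  # ~5.8 ms of audio per callback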

Mar 21 (Thur)

Slides

Notes

Audio Unit Generator

Software Organization for Music Signal Processing

Web Audio

Reading

Andrej Hronco, “Making Music in the Browser - Web Audio API, Part 1”

Concepts

Unit Generators and Object Oriented Programming

Calculation Tree

Order of Execution

Block-by-block Processing of Audio

WebAudio

AudioContext

AudioNode

AudioParam

scheduling
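
A toy unit-generator sketch in Python (illustrative only; not Web Audio or any real framework, and NumPy is assumed installed): each object computes one block on demand and pulls blocks from its inputs, so asking the output node for a block fixes the order of execution over the calculation tree.

    import numpy as np

    SR, BLOCK = 44100, 64

    class Sine:                      # a leaf unit generator
        def __init__(self, freq):
            self.freq, self.t = freq, 0.0
        def run(self, frames):
            t = self.t + np.arange(frames) / SR
            self.t += frames / SR
            return np.sin(2 * np.pi * self.freq * t)

    class Gain:                      # one input, scaled
        def __init__(self, source, amp):
            self.source, self.amp = source, amp
        def run(self, frames):
            return self.amp * self.source.run(frames)

    class Mix:                       # sums any number of inputs
        def __init__(self, *sources):
            self.sources = sources
        def run(self, frames):
            return sum(s.run(frames) for s in self.sources)

    out = Mix(Gain(Sine(440), 0.3), Gain(Sine(660), 0.2))   # a small calculation tree
    block = out.run(BLOCK)                                   # block-by-block processing
    print(block.shape)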

Week 10

Mar 26 (Tues)

Slides

Content-Based Music Information Retrieval

Audio Alignment

Reading

McNab, R. J., Smith, L. A., Witten, I. H., Henderson, C. L., and Cunningham, S. J. 1996. “Towards the digital music library: tune retrieval from acoustic input.” In Proceedings of the First ACM International Conference on Digital Libraries (Bethesda, Maryland, United States, March 20 - 23, 1996). E. A. Fox and G. Marchionini, Eds. DL ’96. ACM Press, New York, NY, 11-18.

Dannenberg, Dynamic Programming for Music Search

Concepts

QBH (query by humming) - what it does

Themes

DP for approximate substring matching

Music fingerprinting - what it does

Music fingerprinting - features

Music fingerprinting - hashing techniques
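
The dynamic-programming item above is worth one concrete sketch: approximate substring matching is edit distance where the first row is all zeros (the query may start anywhere in the target) and the answer is the minimum of the last row (it may end anywhere). A toy Python version, for illustration only, with melodies coded as pitch-interval sequences:

    def substring_edit_distance(query, target):
        prev = [0] * (len(target) + 1)          # row 0: a match may start anywhere
        for i, q in enumerate(query, 1):
            curr = [i]                          # skipping i query symbols costs i
            for j, t in enumerate(target, 1):
                cost = 0 if q == t else 1
                curr.append(min(prev[j - 1] + cost,   # match or substitute
                                prev[j] + 1,          # skip a query symbol
                                curr[j - 1] + 1))     # skip a target symbol
            prev = curr
        return min(prev)                        # best ending position anywhere

    query  = [2, 2, 1, 2]
    target = [5, 2, 2, 2, 2, -3, 2, 2, 1, 2, 4]
    print(substring_edit_distance(query, target))   # 0: the query occurs exactly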

  ◀ Homework 4 (due Apr 5)

Mar 28 (Thur)

Video: Winkler/Ferrero

TBD - maybe Soundcool, Web Audio

Week 11

Apr 2 (Tues)

Slides

Concurrency Part 1
Real-time
Static priority
Priority inversion

Reading

Two Models for Concurrent Programming

Alex Edwards, Understanding Mutexes

Apr 4 (Thur)

Slides

Video: Max Mathews Radio Baton (in class, started at 9:26)

Concurrency Part 2
Real-time
Static priority
Priority inversion

Concepts

“Classical” synchronization primitives: locks, semaphores

Synchronization with message passing/mailboxes

Real-time issues: blocking, priority inversion

Lock-free synchronization

Earliest deadline first scheduling
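
The message-passing item above, sketched in Python for illustration (Python's queue.Queue uses a lock internally, so real audio code would use a lock-free ring buffer, but the pattern is the same): the control thread posts small messages, and the time-critical loop drains them with non-blocking gets instead of sharing state behind a lock it might have to wait on.

    import threading, queue, time

    mailbox = queue.Queue()
    state = {"volume": 1.0, "running": True}

    def control_thread():
        for v in (0.8, 0.5, 0.2):
            time.sleep(0.05)
            mailbox.put(("volume", v))
        mailbox.put(("quit", None))

    def audio_loop():
        while state["running"]:
            try:
                while True:                          # drain all pending messages
                    kind, value = mailbox.get_nowait()
                    if kind == "volume":
                        state["volume"] = value
                    elif kind == "quit":
                        state["running"] = False
            except queue.Empty:
                pass
            # ... a real loop would now compute one block of audio using state["volume"]
            time.sleep(0.01)
        print("final volume:", state["volume"])

    t = threading.Thread(target=control_thread)
    t.start()
    audio_loop()
    t.join()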

  ▶ Project 4: Audio effect (due Apr 2)

  ▶ Homework 4 (due Apr 5)

  ◀ Homework 5 (due Apr 18)

  ◀ Project 5: Interactive Performance (interim report due Apr 11, previews Apr 22-23, concert 3:30PM, Apr 28 (Sun), final report due May 1), p5.zip

Week 12

Apr 9 (Tues)

Slides

Music Representation

Reading

Dannenberg, R. “Music Representation Issues, Techniques, and Systems.” Computer Music Journal, 17(3), pp. 20-30

Buxton, W., Sniderman, R., Reeves, W., Patel, S. & Baecker, R., “The Evolution of the SSSP Score Editing Tools.” In Roads, C. & Strawn, J. (1985). Foundations of Computer Music. MIT Press, Cambridge MA, 376-402.

Concepts

File Formats:
Standard MIDI File format
Allegro file format
Symbolic Notation

Operations on scores:
tempo change
articulation
timbre/instrumentation
transposition
octave doubling
parameter mapping

Hierarchical Scores

Multiple Hierarchies

special purpose vs. general/extensible representations

why is music notation difficult?

how much notation information is in a MIDI file?

scores as data types - operations on scores

hierarchy in music data
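
To make "scores as data types" above concrete: even with a flat list-of-notes representation, score operations are just functions from score to score. A toy Python sketch (this is not the Allegro representation or course code):

    from dataclasses import dataclass, replace

    @dataclass
    class Note:
        time: float    # onset in beats
        dur: float     # duration in beats
        pitch: int     # MIDI pitch number
        vel: int = 100

    def transpose(score, semitones):
        return [replace(n, pitch=n.pitch + semitones) for n in score]

    def scale_time(score, factor):     # factor 0.5 = twice as fast
        return [replace(n, time=n.time * factor, dur=n.dur * factor) for n in score]

    score = [Note(0, 1, 60), Note(1, 1, 64), Note(2, 2, 67)]
    print(transpose(score, 12))    # the same notes an octave higher
    print(scale_time(score, 0.5))  # the same notes in half the time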

  ▶ Project 5: Interactive Performance (interim report due Apr 11, previews Apr 22-23, concert 3:30PM, Apr 28 (Sun), final report due May 1), p5.zip

Spring Carnival (no classes)      Apr 11 (Thu) through Apr 14 (Sun)

Week 13

  ◀ Project 5: Interactive Performance (concert 3:30PM, Apr 28 (Sun)) p5.zip

Apr 16 (Tues)

Roger Linn's directory of New Electronic Musical Instruments

Sensors and Instruments

(guest speaker: Roger Linn)

Concepts

input devices and sensors

computer music instruments

NIME: New Interfaces for Music Expression

  ◀ Project 6: Genre Classifier (due May 3) p6_helper.zip data_genre.zip data_genre_small.zip Note: you can use either data_genre.zip (482MB) or data_genre_small.zip (24MB), but try to use the large one.

Apr 18 (Thur)

Music Representation (continued)

  ▶ Homework 5 (due Apr 18)

Week 14

Apr 23 (Tues)

lecture video

Slides

Music Understanding

Human-Computer Music Performance

Reading

Dannenberg, R. and Raphael, C., “Music Score Alignment and Computer Accompaniment,” in Communications of the ACM, 49(8) (August 2006), pp. 38-43.

Concepts

computer accompaniment

score following

monophonic melodies

polyphonic keyboard performance

vocal performance

accompaniment synchronization

score is known, timing is unknown

performance error and robust matching

dynamic programming

probabilistic score following techniques

semi-autonomous accompaniment generation

How does HCMP differ from Computer Accompaniment?

What are the challenges for HCMP?

The beats, measures, cues model of synchronization

How does HCMP use virtual time, forward synchrony, etc.?

machine learning of style recognition

onset detection

Apr 25 (Thur)

Slides

Music Features
Key estimation
Chord recognition
Beat detection

Reading

Dannenberg, Thom, and Watson, “A Machine Learning Approach to Musical Style Recognition” in 1997 International Computer Music Conference, International Computer Music Association (September 1997), pp. 344-347.

Concepts

chroma vectors

chromagram

principles of beat detection

tempo estimation vs beat tracking
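
A chroma vector folds spectral energy onto the 12 pitch classes. The NumPy sketch below is deliberately minimal (no windowing, tuning estimation, or log-frequency weighting, and NumPy is assumed installed); it is only meant to show the folding step:

    import numpy as np

    SR, N = 22050, 4096

    def chroma(frame):
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(N, 1.0 / SR)
        vec = np.zeros(12)
        for f, mag in zip(freqs[1:], spectrum[1:]):       # skip the DC bin
            midi = 69 + 12 * np.log2(f / 440.0)           # bin frequency -> MIDI pitch
            vec[int(round(midi)) % 12] += mag ** 2        # fold onto pitch class 0-11
        return vec / vec.sum()

    # A synthetic C major chord (C4, E4, G4) as a test frame.
    t = np.arange(N) / SR
    frame = sum(np.sin(2 * np.pi * f * t) for f in (261.63, 329.63, 392.0))
    print(np.round(chroma(frame), 2))   # energy concentrated in classes 0 (C), 4 (E), 7 (G)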

  ▶ Project 5: Interactive Performance (preview Apr 22 & 23) p5.zip

Apr 27 (Sat)

Rehearsals

Rehearsal: 1:30-3PM in GHC 7208 (same room as previews)

Rehearsals are optional, but a good place to test and figure out how to best feature your work in the concert. I hope you can make one of them.

3:30PM, Apr 28 (Sun)

Concert

load-in: 10am-12 (GHC 7208)

setup: 12-1:30pm (STUDIO, CFA 111)

class arrives, plug in: 1:30pm

run-through: 2pm

doors open to public: 3pm

Concert, 3:30pm, invite your friends!

load-out: 4:30-6pm

Concert location: Frank-Ratchye STUDIO for Creative Inquiry, CFA 111

Guest (semi)Conductor: Brian Riordan, award-winning composer and Ph.D. candidate, University of Pittsburgh

Week 15

Apr 30 (Tues)

Slides

Audio Editors

Concepts

How Audacity Works

Efficient editing structures

Destructive/Non-destructive editing
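
The non-destructive idea above, in miniature: represent the edited track as a list of clips that refer into immutable source audio, so a cut only edits the clip list (and undo just restores an earlier list). A rough Python sketch, loosely inspired by the block-file idea but not Audacity's actual implementation:

    # A track is a list of (source, start, length) clips measured in samples.
    def cut(clips, at, length):
        """Remove `length` samples starting at track position `at`."""
        out, pos = [], 0
        for src, start, n in clips:
            before = max(0, min(n, at - pos))             # part of this clip before the cut
            if before:
                out.append((src, start, before))
            after_start = max(before, at + length - pos)  # first kept sample after the cut
            if after_start < n:
                out.append((src, start + after_start, n - after_start))
            pos += n
        return out

    track = [("take1.wav", 0, 1000), ("take2.wav", 500, 1000)]
    print(cut(track, 800, 400))
    # [('take1.wav', 0, 800), ('take2.wav', 700, 800)] -- no samples were copied or rewritten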

  ▶ Project 5: Final Report (due May 1), p5.zip

May 2 (Thurs)

Slides

Semester Review

Practice Exam

 

  ▶ Project 6: Genre Classifier (due May 3) p6_helper.zip data_genre.zip data_genre_small.zip Note: you can use either data_genre.zip (482MB) or data_genre_small.zip (24MB), but try to use the large one.

Final Exam: May 10, 5:30-8:30pm, Location: POS 152