A Virtual Xylophone for Music Education
Abstract
This paper describes the implementation of a virtual xylophone. During a setup phase, the program registers a background depth image generated by a Kinect sensor, and the user interacts with the program to identify the colors of the tone bars and to select a restricted region in which mallet locations are tracked. During a play phase, the program tracks mallet heads by locating pixels that lie in front of the corresponding pixels in the registered background image. The program can easily be modified to restrict the notes available to the player or to use pentatonic or other musical scales.
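The play-phase idea in the abstract, detecting mallet pixels as those closer to the camera than the registered background within a user-selected region, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the millimeter tolerance, and the centroid step are assumptions, and the actual system may segment multiple mallets differently.

```python
import numpy as np

def find_mallet_pixels(depth_frame, background_depth, track_region, tolerance_mm=30):
    """Return a boolean mask of candidate mallet pixels.

    depth_frame      -- current depth image from the sensor, in millimeters
    background_depth -- depth image registered during the setup phase
    track_region     -- (row_min, row_max, col_min, col_max) selected by the user
    tolerance_mm     -- hypothetical noise margin; the paper does not give a value
    """
    r0, r1, c0, c1 = track_region
    mask = np.zeros(depth_frame.shape, dtype=bool)

    frame = depth_frame[r0:r1, c0:c1]
    background = background_depth[r0:r1, c0:c1]

    # A pixel is treated as foreground (mallet head) if it reports a valid depth
    # and lies in front of the corresponding background pixel by more than the margin.
    valid = (frame > 0) & (background > 0)
    mask[r0:r1, c0:c1] = valid & (frame < background - tolerance_mm)
    return mask

def mallet_location(mask):
    """Estimate a single mallet-head position as the centroid of foreground pixels."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return rows.mean(), cols.mean()
```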
Department(s)
Computer Science
Document Type
Conference Proceeding
DOI
https://doi.org/10.1109/ism.2016.0094
Publication Date
2016
Recommended Citation
Burks, Nikolas, Lloyd Smith, and Jamil Saquer. "A virtual xylophone for music education." In 2016 IEEE International Symposium on Multimedia (ISM), pp. 409-410. IEEE, 2016.