ABSTRACT
Music listening is an important activity for many people. Advances in technology have made it possible to carry music collections of thousands of songs on portable music players. Navigating these large collections is challenging, especially for users with vision and/or motion disabilities. In this paper we describe our current efforts to build effective music browsing interfaces for people with disabilities. The foundation of our approach is the automatic extraction of features that describe musical content, and the use of self-organizing maps to create two-dimensional representations of music collections. The ultimate goal is effective browsing without using any meta-data. We also describe different control interfaces to the system: a regular desktop application, an iPhone implementation, an eye tracker, and a smart-room interface based on Wii-mote tracking.
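The core idea above, extracting a feature vector per song and projecting the collection onto a two-dimensional grid with a self-organizing map, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions (grid size, linear decay schedules, and the function names `train_som`/`map_song` are illustrative choices, not the paper's Marsyas-based implementation):

```python
import numpy as np

def train_som(features, grid_h=8, grid_w=8, epochs=20, seed=0):
    """Fit a rectangular self-organizing map to a feature matrix.

    features: (n_songs, dim) array of per-song audio feature vectors.
    Returns grid weights of shape (grid_h, grid_w, dim).
    """
    rng = np.random.default_rng(seed)
    dim = features.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates, used to compute neighborhood distances on the map.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    sigma0 = max(grid_h, grid_w) / 2.0   # initial neighborhood radius
    lr0 = 0.5                            # initial learning rate
    n_steps = epochs * len(features)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(features):
            # Best-matching unit: the cell whose weight vector is nearest.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.array(np.unravel_index(np.argmin(dists), dists.shape),
                           dtype=float)
            # Linearly decay the learning rate and radius over training.
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = 1.0 + (sigma0 - 1.0) * (1.0 - frac)
            # Gaussian neighborhood centered on the BMU: nearby cells are
            # pulled toward the sample, so similar songs end up adjacent.
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1)
                       / (2.0 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
            t += 1
    return weights

def map_song(weights, feature_vec):
    """Return the (row, col) grid cell a song's feature vector maps to."""
    dists = np.linalg.norm(weights - feature_vec, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)
```

After training, each song is placed at its best-matching cell, giving the two-dimensional layout that the browsing interfaces (desktop, iPhone, eye tracker, Wii-mote) navigate; because placement depends only on extracted audio features, no meta-data is required.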