Adomavicius, G., Bockstedt, J., Gupta, A., & Kauffman, R. J. (2006). Understanding Patterns of Technology Evolution: An Ecosystem Perspective. Hawaii International Conference on System Sciences. Minneapolis.
Barrass, S. (2005). A Perceptual Framework for the Auditory Display of Scientific Data. Canberra, Australia: University of Canberra.
Bauck, J., & Cooper, D. H. (2002). Developments in Transaural Stereo. IEEE. Minneapolis: IEEE.
Braasch, J., Peters, N., & Valente, D. L. (2008). A loudspeaker-based projection technique for spatial music applications using Virtual Microphone Control. Computer Music Journal, 32(3), 55-71.
Corteel, E., & Caulkins, T. (2004). Sound scene creation and manipulation using Wave Field Synthesis. IRCAM. Paris: IRCAM.
Cullen, C. (2006). The Sonic Representation of Mathematical Data. Doctoral Thesis, Dublin Institute of Technology, Dublin.
Cycling’74. (2011). Cycling ’74. Retrieved August 15, 2011, from Cycling ’74: http://cycling74.com/
Hermann, T. (2008). Taxonomy and definitions for Sonification and Auditory Display. International Conference on Auditory Display – ICAD. Paris: Faculty of Technology, Bielefeld University.
Ibrahim, A. A., & Hunt, A. (2006). A general HCI framework of Sonification applications. International ACM SIGACCESS. New York.
ICAD. (2011). International Community for Auditory Display. Retrieved August 15, 2011, from International Community for Auditory Display: http://www.icad.org/
Jamoma. (2011). Jamoma – A platform for interactive art-based research and performance. Retrieved August 18, 2011, from Jamoma: http://jamoma.org/
Kemp, J. (2011). Finite Difference Simple Harmonic Oscillator. Edinburgh, Scotland: University of Edinburgh.
Kemp, J. (2011). Image Source Reverberator. Edinburgh, Scotland: University of Edinburgh.
Kemp, J. (2011). Matlab Schroeder Reverberator. Edinburgh, Scotland: University of Edinburgh.
Krikorian, R., et al. (2001). Localization of sound in Micro-Gravity. Retrieved August 15, 2011, from http://web.media.mit.edu/~raffik/zero-g/aup/final.html
Lossius, T. (2008). Controlling spatial sound within an installation art context. International Computer Music Conference. Bergen.
Maeder, M. (2011). Home. Retrieved June 23, 2011, from trees: http://blog.zhdk.ch/marcusmaeder/
Malham, D. G., & Myatt, A. (1995). 3-D Sound Spatialization using Ambisonic Techniques. Computer Music Journal, 19(4).
McLuhan, M. (2004). Visual and Acoustic Space. In C. Cox & D. Warner, Audio Culture: Readings in Modern Music. London: Continuum.
Morville, P. (2004). A brief history of information architecture. In A. Gilchrist & B. Mahon, Information Architecture: Designing Information Environments for Purpose. London, UK: Facet Publishing.
Nasir, T., & Roberts, J. C. (2007). Sonification of spatial data. International Conference on Auditory Display. Montreal: ICAD.
Parker, M. (2010). mp.assignment2. Retrieved May 15, 2011, from Max Help: http://sd.caad.ed.ac.uk/maxhelp/2010/07/mp-assignment2/
Pauletto, S. (2009). Interactive sonification of complex data. International Journal of Human-Computer Studies.
Peters, N. (2008). Proposing SpatDIF – The Spatial Sound Description Interchange Format. ICMC. Montreal.
Peters, N. (2010). Sweet (re)production: Developing sound spatialization tools for musical applications with emphasis on sweet spot and off-center perception. Doctoral Thesis, McGill University, Montreal.
Peters, N. (2009). ViMiC – Virtual Microphone Control for Max/MSP. Jamoma.
Peters, N., Ferguson, S., & McAdams, S. (2007). Towards a Spatial Sound Description Interchange Format (SpatDIF). CIRMMT, Montreal.
Peters, N., Lossius, T., Schacher, J., Baltazar, P., Bascou, C., & Place, T. (2009). A stratified approach for sound spatialization. Sound and Music Computing Conference. Porto.
Peters, N., Matthews, T., Braasch, J., & McAdams, S. (2008). Spatial sound rendering in Max/MSP with ViMiC. International Computer Music Conference. Montreal and Troy.
Pulkki, V. (2002). Compensating displacement of amplitude-panned virtual sources. Helsinki: AES.
Pulkki, V. (2001). Spatial sound generation and perception by Amplitude Panning Techniques. Doctoral Thesis, Helsinki University of Technology, Helsinki.
Rayleigh, L. (1876). On our perception of the direction of a source of sound. Taylor & Francis.
Saue, S. (2000). A model for interaction in exploratory sonification displays. ICAD proceedings, Norwegian University of Science and Technology, Department of Telecommunications, Acoustics, Trondheim.
Shure. (2011). Microphones: Polar pattern / Directionality. Retrieved August 17, 2011, from Shure: http://www.shure.co.uk/support_download/educational_content/microphones-basics/microphone_polar_patterns
Sima, S. (2008). HRTF Measurements and filter design for a headphone-based 3D audio system. Thesis, Hamburg.
Sonifyer.org. (2011). Sonifyer.org. Retrieved August 15, 2011, from Sonifyer.org: http://www.sonifyer.org/?lang=en
Stefani, E., & Mooney, J. (2009). Spatial composition in the multi-channel domain: Aesthetics and Techniques. International Computer Music Conference. Montreal.
Stern, R. M., Wang, D. L., & Brown, G. (2006). Binaural Sound Localization. In D. L. Wang & G. Brown, Computational Auditory Scene Analysis. New York, USA: IEEE Press.
Vickers, P. (2004). Caitlin – Auditory External Representation of Programs. Retrieved July 20, 2011, from auralisation.org: http://computing.unn.ac.uk/staff/cgpv1/caitlin/
Vogt, K. (2008). Sonification and particle physics. Karl-Franzens-University; University for Music and Dramatic Arts, Theoretical Physics; Electronic Music and Acoustics, Graz.
Walker, B. N., & Kramer, G. (2006). Auditory Displays Article xx: Sonification. Georgia Institute of Technology, School of Psychology and College of Computing. Atlanta: Georgia Institute of Technology.
Winter, M. (2005). On Sonification. Los Angeles, USA.
Zhou, Z. H. (n.d.). Sound localization and virtual auditory space. Toronto: University of Toronto.
…