
Acoustic Quality Parameters Used for Error Evaluation of Neural Networks Modeling for HRIRs Applied to Escape Training in Blind Conditions




DOI: https://doi.org/10.15866/ireche.v5i6.6947

Abstract


This work presents an accuracy analysis of a system for Head-Related Impulse Response (HRIR) interpolation based on artificial neural networks (ANN). Error analysis in the time domain, comparing the target and output functions, can yield misleading results if the functions are slightly time-shifted. Frequency-domain errors, in turn, may be biased by high-frequency components, where the resolution is lower than at low frequencies, and by frequency ranges that do not affect human perception. The proposed criteria are based on adapting well-known room acoustic quality parameters to the evaluation of the interpolation error. The article presents the ANN architecture and the configuration that is optimal, in terms of processing speed and accuracy, under the proposed error criteria. The comparative results were obtained over the receiving region where human auditory capability is most sensitive and interaural differences are most pronounced (the horizontal plane). The results show that the modified acoustic parameters provide a good estimate of HRIR interpolation accuracy.
Copyright © 2013 Praise Worthy Prize - All rights reserved.
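
To illustrate why a plain time-domain error can be misleading, the following minimal Python/NumPy sketch (a toy illustration, not the authors' procedure; the synthetic impulse response, sampling rate, and two-sample delay are assumptions) compares a decaying sinusoid standing in for an HRIR with a slightly delayed copy of itself. The relative time-domain mean squared error is large, while the magnitude-spectrum error is essentially zero, since a pure delay does not change the magnitude spectrum.

    import numpy as np

    fs = 44100                      # assumed sampling rate in Hz
    t = np.arange(256) / fs

    # Toy "HRIR": a decaying sinusoid standing in for a measured response
    hrir = np.exp(-t * 3000) * np.sin(2 * np.pi * 3000 * t)

    # The same response delayed by two samples, e.g. an otherwise perfect
    # interpolation output that is only slightly time-shifted
    shifted = np.roll(hrir, 2)

    # Plain time-domain mean squared error: large, despite identical content
    mse_time = np.mean((hrir - shifted) ** 2) / np.mean(hrir ** 2)

    # Magnitude-spectrum error: insensitive to the pure delay
    H1 = np.abs(np.fft.rfft(hrir))
    H2 = np.abs(np.fft.rfft(shifted))
    mse_mag = np.mean((H1 - H2) ** 2) / np.mean(H1 ** 2)

    print(f"relative time-domain MSE:      {mse_time:.3f}")
    print(f"relative magnitude-domain MSE: {mse_mag:.3e}")

Running the sketch prints a sizeable time-domain error and a near-zero magnitude-domain error, which motivates error criteria that, like the adapted room acoustic quality parameters, are not dominated by small time shifts or perceptually irrelevant frequency content.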

Keywords


HRIR Interpolation; Artificial Neural Networks; Acoustic Quality Parameters



