Hearing instrument testing (HIT), also known as test box or HIT box measurement, is a set of measurements that assesses whether a hearing instrument’s performance meets manufacturer specifications within defined tolerances.
This article provides an overview of the technical and clinical HIT measurements in common use and how you can apply the information they provide in clinical practice.
Hearing instrument test box measurements fall into two main categories.
The first is clinical HIT. These measurements produce a record of a user’s hearing instrument frequency response over time and provide the opportunity for quick and easy monitoring of hearing aid performance. Clinical HIT is primarily compared against the user’s own amplification history, or against the patient’s calculated prescription. The information gathered from clinical HIT measurements can be useful to share with other professionals involved in the care of the end user.
The second is technical HIT. These measurements verify that a hearing instrument is working as it should against the manufacturer’s specifications. For technical HIT measurements, the hearing aid is set to either full-on gain or a pre-defined reference test gain. The instrument being tested should fall within tolerances set out in the relevant international standards which are detailed below.
To perform HIT box measurements, you need a:
You also need a sound-treated test chamber, of which Interacoustics offer the following (Table 1).
| Product | Type |
| --- | --- |
| Affinity Compact | Hearing aid fitting system |
| Affinity 2.0 | Hearing aid fitting system |
| TBS10 | Hearing aid test box |
| TBS25 | Hearing aid test box |
Table 1: Sound-treated test chambers offered by Interacoustics.
Hearing instruments are assessed against two main sets of standards: ANSI/ASA S3.22 and IEC 60118-0.
Both of these standards set out a series of parameters, which cover the:
Before any hearing aid is sent out from the factory, it is checked and measured against these standards.
HIT box measurements are a crucial step at many stages of an individual’s fitting journey.
Hearing instruments in transit are exposed to a range of conditions, including changes in weather and temperature as well as vibration. Hearing instruments can be sensitive to these conditions, so it is sensible to consider performing some basic quality-assurance HIT measurements at the point of hearing aid receipt. This can help you catch any issues that arise during transit.
Once you have fitted a patient with a hearing device, it is advisable to begin documenting their amplification history through clinical HIT measurements. Having this information available alongside the patient’s self-reported experience at follow-up can help to inform your ongoing management decisions.
You can use both clinical and technical HIT as excellent troubleshooting tools, such as when a patient returns to clinic reporting a suspected fault with the hearing aid, distortion, or reduced output.
The clinical HIT is a test box measurement protocol which uses the end user’s own hearing instrument amplification settings to create a quantitative record of the instrument’s amplification output. This step of good record keeping can provide an extra level of monitoring to:
The clinical HIT consists of four main measurements: a record of hearing aid output in response to three different input levels, and a measure of maximum output sound pressure level (MOSPL). The use of the test box makes these measurements controlled and repeatable.
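To make the record-keeping idea concrete, here is a minimal sketch of how a clinical HIT record could be stored for comparison across appointments. The specific input levels (50, 65, and 80 dB SPL) and the data layout are illustrative assumptions, not a vendor format.

```python
# A minimal sketch of a clinical HIT record, assuming 50/65/80 dB SPL inputs.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ClinicalHITRecord:
    patient_id: str
    measured_on: date
    # Output curves in dB SPL, keyed by input level in dB SPL (assumed levels).
    output_curves: dict[int, list[float]] = field(default_factory=dict)
    mospl_curve: list[float] = field(default_factory=list)  # maximum output SPL

    def is_complete(self) -> bool:
        """True when all three input levels and the MOSPL sweep are recorded."""
        has_inputs = all(level in self.output_curves for level in (50, 65, 80))
        return has_inputs and bool(self.mospl_curve)
```

Keeping one such record per appointment makes it straightforward to compare today’s output curves against the patient’s amplification history.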
You can learn more about clinical HIT box measurements by watching the video below, which includes demonstrations.
The most common technical HIT box measurements are the OSPL 90, the full-on gain measurement, the reference test gain, and the harmonic distortion test.
We’ll describe each of these tests briefly. You can also learn about the different technical HIT box measurements by watching the video below, which includes demonstrations of each.
The output sound pressure level 90 dB measurement, also known as the OSPL 90, assesses whether the hearing instrument output is appropriate for loud input levels (90 dB SPL). The OSPL 90 measures hearing aid output in dB SPL as a function of frequency in kHz. The hearing aid is set to full-on gain for this measurement, and the test uses a 90 dB SPL pure tone sweep from 200 to 8000 Hz.
The key output values to consider are the maximum OSPL 90 level and the high-frequency average (HFA) level, which should be within plus/minus 3 dB and plus/minus 4 dB of the manufacturer’s specifications, respectively. It is also sensible to review the frequency response graph to assess for any unusual morphology, such as low output at specific frequencies.
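As a minimal sketch of that pass/fail check, the snippet below applies the two tolerances described above. The specification values in the example are hypothetical; real values come from the manufacturer’s data sheet for the specific instrument.

```python
# A minimal sketch of the OSPL 90 tolerance check (hypothetical spec values).
def ospl90_within_tolerance(measured_max: float, measured_hfa: float,
                            spec_max: float, spec_hfa: float) -> bool:
    """Apply plus/minus 3 dB (max OSPL 90) and plus/minus 4 dB (HFA) tolerances."""
    return abs(measured_max - spec_max) <= 3.0 and abs(measured_hfa - spec_hfa) <= 4.0

# Example: a hypothetical spec of 119 dB SPL max OSPL 90 and 113 dB SPL HFA.
print(ospl90_within_tolerance(measured_max=117.5, measured_hfa=111.0,
                              spec_max=119.0, spec_hfa=113.0))  # True
```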
The full-on gain measurement assesses how the hearing aid performs at full-on gain to a moderate intensity stimulus. This measurement also produces a frequency response graph of hearing aid gain in dB SPL as a function of frequency in kHz.
As with the OSPL 90, you should consider the morphology of the frequency response chart. From this measurement, you can read the maximum gain being produced and identify the frequency at which the most gain is produced. The HFA level is also used for full-on gain testing and should be within plus/minus 5 dB of the manufacturer’s specifications.
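Reading the peak of the gain curve is simple arithmetic, as the short sketch below shows. The frequencies and gain values are made up for illustration.

```python
# A minimal sketch of reading maximum gain and its frequency from a
# full-on gain curve (illustrative values only).
frequencies_hz = [250, 500, 1000, 1600, 2500, 4000, 6300]
gain_db = [28.0, 35.5, 48.0, 52.5, 50.0, 44.0, 33.0]

max_gain = max(gain_db)
peak_frequency = frequencies_hz[gain_db.index(max_gain)]
print(f"Maximum gain of {max_gain} dB at {peak_frequency} Hz")
# The HFA of this curve would then be compared against the manufacturer's
# value using the plus/minus 5 dB tolerance described above.
```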
The reference test gain is a hearing aid adjustment which is needed to complete the remaining test box measurements. It sets the hearing aid to produce a specific output level in response to a 60 dB input level, and this setting is then maintained for many of the following test box measurements.
The reference test gain is measured with a 1600 Hz, 60 dB SPL pure tone, with the hearing aid adjusted to the pre-defined reference test gain in the fitting software.
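The arithmetic behind the target output level is shown below as a minimal sketch. The specific constants (a target of HFA OSPL 90 minus 17 dB, within 1.5 dB, for a 60 dB SPL input) reflect a common formulation of the ANSI/ASA S3.22 reference test setting, but treat them as assumptions and confirm them against the edition of the standard you are working to.

```python
# A minimal sketch of the reference test setting target, assuming the
# commonly cited ANSI/ASA S3.22 constants (17 dB offset, 1.5 dB window).
def reference_test_target(hfa_ospl90: float) -> float:
    """Target HFA output (dB SPL) for a 60 dB SPL input at reference test gain."""
    return hfa_ospl90 - 17.0

def at_reference_test_setting(measured_hfa_output: float, hfa_ospl90: float) -> bool:
    return abs(measured_hfa_output - reference_test_target(hfa_ospl90)) <= 1.5

# Example with a hypothetical HFA OSPL 90 of 113 dB SPL.
print(reference_test_target(113.0))  # 96.0 dB SPL target output
```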
The harmonic distortion test examines whether the hearing instrument exhibits harmonic distortion, which is when the instrument produces harmonics in the output signal that are not present in the input signal.
The harmonic distortion test measures percentage distortion as a function of frequency. The hearing aid is set to reference test gain for this measurement and uses three stimuli:
The test performs a separate measurement for each of the three frequencies being used. Each measurement provides a ‘Total Distortion’ value, which should be no more than 3% to be considered within tolerance.
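For readers who want to see what the distortion percentage actually represents, here is a minimal sketch of an FFT-based total harmonic distortion calculation. The sample rate, tone frequency, and synthetic output signal are illustrative assumptions; a real HIT system performs this measurement for you.

```python
# A minimal sketch of total harmonic distortion from a recorded waveform,
# assuming an FFT-based approach (signal and parameters are illustrative).
import numpy as np

def total_harmonic_distortion(signal: np.ndarray, fs: float, f0: float,
                              n_harmonics: int = 5) -> float:
    """Return THD as a percentage of the fundamental amplitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def amplitude_at(f: float) -> float:
        return spectrum[np.argmin(np.abs(freqs - f))]

    fundamental = amplitude_at(f0)
    harmonics = [amplitude_at(k * f0) for k in range(2, n_harmonics + 2)]
    return 100.0 * np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

# Example: a 500 Hz tone with an artificial 2% second harmonic added.
fs, f0 = 44100, 500.0
t = np.arange(0, 1.0, 1 / fs)
out = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)
print(f"THD: {total_harmonic_distortion(out, fs, f0):.2f}%")  # ~2.00%
```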
The above tests are the most commonly used tests in a reduced technical HIT protocol. We will now explore a more comprehensive protocol, which includes a range of other tests in addition to those discussed above: the frequency response, equivalent input noise, the input/output graph, and attack/recovery time.
Each of these tests is explained below, with the video demonstrating how they are performed and showing the results you may expect to see.
The frequency response measurement records the hearing aid output at reference test gain in response to a 60 dB SPL pure tone sweep (200 to 8000 Hz).
The frequency response measurement will provide information on the hearing instrument’s maximum frequency range and the response to high-band and low-band frequencies. ANSI/ASA S3.22 provides tolerances of plus/minus 6 dB and plus/minus 4 dB for high-band and low-band responses, respectively.
As with previous measurements, comparing the output frequency response chart to the chart contained within the manufacturer’s guidance is also a useful check to assess for obvious abnormalities which are not accounted for by the numerical values.
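The comparison against the manufacturer’s curve can be expressed as a simple band-wise tolerance check, sketched below. The 2000 Hz band split is an illustrative assumption; the standard defines the exact frequency bands to use.

```python
# A minimal sketch of a band-wise frequency response check, assuming a
# 2000 Hz low/high band split (the standard defines the actual bands).
def response_within_tolerance(freqs_hz: list[float], measured_db: list[float],
                              reference_db: list[float],
                              band_split_hz: float = 2000.0) -> bool:
    """Apply plus/minus 4 dB below the split and plus/minus 6 dB above it."""
    for f, m, r in zip(freqs_hz, measured_db, reference_db):
        tolerance = 4.0 if f < band_split_hz else 6.0
        if abs(m - r) > tolerance:
            return False
    return True
```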
The equivalent input noise measurement determines the level of internal microphone noise in the hearing device. It measures output in dB SPL as a function of frequency at reference test gain and uses a 50 dB (IEC) or 60 dB (ANSI) pure tone sweep.
The equivalent input noise test calculates a value of equivalent input noise by subtracting the high-frequency average gain from the measured output noise. It is worth considering that different microphone settings (such as full-directional vs. omnidirectional) may produce different equivalent input noise values, making this an excellent method of troubleshooting this advanced hearing aid feature.
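The calculation itself is simple arithmetic, sketched below under the common formulation where the aid’s average gain (HFA output minus input level) is subtracted from the output noise floor. The numbers in the example are hypothetical.

```python
# A minimal sketch of the equivalent input noise (EIN) arithmetic,
# assuming EIN = output noise minus HFA gain (values are hypothetical).
def equivalent_input_noise(output_noise_db: float, hfa_output_db: float,
                           input_level_db: float = 50.0) -> float:
    hfa_gain = hfa_output_db - input_level_db  # average gain in dB
    return output_noise_db - hfa_gain          # noise referred to the input

# Example: 62 dB SPL output noise, 85 dB SPL HFA output for a 50 dB SPL
# input, giving an EIN of 27 dB SPL.
print(equivalent_input_noise(62.0, 85.0))  # 27.0
```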
The input/output graph shows how the hearing aid applies different gain to different input levels at a single frequency. This describes the compression characteristics of the hearing aid. The measurement will typically start with inputs below 50 dB SPL and steadily increase to greater than 90 dB SPL. It measures dB SPL output as a function of dB SPL input in reference test gain.
The input/output graph can show us information such as the compression knee point, or the level at which the hearing aid starts applying limiting to the output signal.
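The compression ratio can be read from any two points on the curve above the knee point, as the minimal sketch below shows. The levels used are illustrative.

```python
# A minimal sketch of reading a compression ratio from two points on the
# input/output curve above the knee point (illustrative levels).
def compression_ratio(input_low: float, output_low: float,
                      input_high: float, output_high: float) -> float:
    """Compression ratio = change in input level / change in output level."""
    return (input_high - input_low) / (output_high - output_low)

# Example: inputs rising from 60 to 80 dB SPL while the output rises from
# 95 to 105 dB SPL implies 2:1 compression.
print(compression_ratio(60, 95, 80, 105))  # 2.0
```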
The attack/recovery time is measured to show how quickly the hearing aid compressor reacts to an increase and decrease in input signal. It measures dB SPL output as a function of time in milliseconds using the reference test gain. The measurement will provide you with numerical values for the attack time and the recovery time.
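To illustrate what those numbers mean, here is a minimal sketch of estimating attack time from an output envelope: the time taken for the output to settle within 3 dB of its final steady-state level after the input steps up. The 3 dB settling criterion is a common convention, but confirm the exact definition in the standard you are working to.

```python
# A minimal sketch of attack time estimation, assuming a 3 dB settling window.
def attack_time_ms(times_ms: list[float], envelope_db: list[float],
                   steady_state_db: float, window_db: float = 3.0):
    """Return the first time after which the envelope stays within window_db."""
    for i in range(len(envelope_db)):
        if all(abs(x - steady_state_db) <= window_db for x in envelope_db[i:]):
            return times_ms[i]
    return None

# Example: an output settling toward roughly 86 dB SPL after an input step.
times = [0, 5, 10, 15, 20, 25]
env = [95.0, 92.0, 88.5, 86.5, 86.0, 86.2]
print(attack_time_ms(times, env, steady_state_db=86.0))  # 10 (ms)
```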
You can also use technical HIT measurements to assess the hearing instrument’s battery drain and telecoil performance, which can be excellent troubleshooting tools. The tests we will explore here are the battery current drain measurement and the coil frequency response and full-on gain response.
The video below demonstrates these measurements being performed, with further information about the tests found below.
For the battery current drain measurement, we measure the current (in milliamps) that the hearing aid draws from the battery. The test records milliamps as a function of frequency with the hearing aid at reference test gain.
Manufacturer specifications will typically provide values for current drain during both typical usage and quiescent (at-rest) operation. The measurement uses a battery pill accessory to measure the amount of power being drawn by the hearing aid. This test can be useful to undertake should a patient report a significant change in the lifespan of their batteries.
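A measured drain translates directly into an expected battery life, which is often the clinically useful number. The sketch below assumes a typical zinc-air cell capacity; the figures are illustrative, and real capacities vary by battery size and manufacturer.

```python
# A minimal sketch of turning current drain into an expected battery life,
# assuming a nominal cell capacity (illustrative values).
def estimated_battery_life_hours(capacity_mah: float, drain_ma: float) -> float:
    return capacity_mah / drain_ma

# Example: a size 312 cell of roughly 180 mAh at a measured 1.2 mA drain.
print(f"{estimated_battery_life_hours(180.0, 1.2):.0f} hours")  # 150 hours
```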
For the coil frequency response and full-on gain response, the purpose is to measure the response of the hearing aid when using a telephone magnetic field simulator (TMFS) coil as the input, rather than an acoustic input from the test box loudspeaker.
This measurement records the output in dB SPL as a function of frequency. The hearing aid can be set to the reference test gain or the full-on gain depending on which measurement you are running.
Most children with a mild to moderate hearing loss will perform reasonably well in a classroom setting that is small or quiet. However, if that classroom setting is busier, noisier, or larger, they may find it much more difficult.
For these cases, there are systems designed to improve the signal-to-noise ratio (SNR). These systems are often referred to as FM systems or radio aids. Generally, they work in the same way: the teacher or target speaker wears a transmitter, usually as close as possible to the mouth, and the child wears a receiver.
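To make the SNR improvement concrete, the minimal sketch below compares the level arithmetic with and without a transmitter near the talker’s mouth. All of the levels used are illustrative assumptions, not measured classroom data.

```python
# A minimal sketch of SNR arithmetic in decibels (illustrative levels).
def snr_db(signal_level_db: float, noise_level_db: float) -> float:
    return signal_level_db - noise_level_db

# Unaided: speech at the child's ear vs. classroom noise.
classroom = snr_db(signal_level_db=65.0, noise_level_db=60.0)  # +5 dB
# With a transmitter close to the talker's mouth, the received speech
# level is much higher relative to the same noise.
with_fm = snr_db(signal_level_db=80.0, noise_level_db=60.0)    # +20 dB
print(f"SNR improvement: {with_fm - classroom:.0f} dB")        # 15 dB
```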
To learn more about how to verify and use FM systems, please refer to the video below.
The Aidapter (Figure 2) is an accessory used for testing receiver-in-the-canal type instruments or thin tube hearing aid technologies, where you can't use standard coupler adapters designed for a BTE hearing aid hook.
Historically, methods of testing these instruments in a test box have included the use of putty to create a seal around the receiver or thin tube. However, the Aidapter can allow for a more consistent method of coupling the hearing aid to the test box equipment.
You can learn more about testing with the Aidapter in the video below.
Given the varied applications of test box measurements in quality assurance, patient monitoring, and informing clinical management decisions, they are important in routine clinical practice. They can help you to deliver positive interventions which are verified, safe, and reliable for the end user.
If you wish to take your learning about HIT measurements a step further, consider enrolling in our free online course: Getting started: Hearing Instrument Testing (HIT).
If you are looking to expand your clinical practice to include test box measurements, explore our hearing aid fitting systems.