Reading Time: 7 minutes
In an ideal environment, an RF receiver would sit as close to the transmitter as possible. In practice that is rarely the case, and forcing it would defeat the purpose of wireless communication in the first place. RF receiver sensitivity testing evaluates a receiver’s ability to process weak signals while maintaining reliable communication. The goal is to determine the lowest input signal power at which the receiver still meets predefined performance criteria. In this guide, we will cover:
- What the major certifying bodies require for receiver sensitivity
- The benefits of better sensitivity ratings and factors that influence them
- How RF receiver sensitivity is tested to meet certification goals in key markets
An Overview of RF Receiver Sensitivity and Factors Influencing Ratings
RF receiver sensitivity is a critical performance metric that defines how well a radio frequency (RF) receiver can detect weak signals while maintaining an acceptable signal-to-noise ratio (SNR) and bit error rate (BER) or packet error rate (PER). It is typically measured in decibel-milliwatts (dBm), with lower values (more negative) indicating better sensitivity.
A more sensitive receiver can successfully demodulate weaker signals, making it particularly important for applications in wireless communication, radar, satellite systems, and IoT devices. More specifically, better (more negative) sensitivity ratings extend communication range, reduce power consumption by minimizing repeat transmissions, and enhance reliability in environments where consistent communication is crucial.
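As a rough rule of thumb (a simplified model assuming thermal noise at room temperature and ideal filtering), the best-case sensitivity can be estimated as:

Sensitivity (dBm) ≈ -174 dBm/Hz + 10·log₁₀(bandwidth in Hz) + noise figure (dB) + required SNR (dB)

For a 20 MHz Wi-Fi channel (about 73 dB of bandwidth term) with a 6 dB noise figure and a 20 dB SNR requirement, that works out to roughly -75 dBm, in line with the ranges shown below.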
Common Requirements for Receiver Sensitivity
Receiver sensitivity requirements vary by application. The table below shows typical ranges drawn from the requirements of various market authorities:
| Technology | Typical Sensitivity Range |
| --- | --- |
| Wi-Fi (802.11) | -75 dBm to -90 dBm |
| Wi-Fi (802.11) DFS | -65 dBm or less |
| Bluetooth | -85 dBm to -95 dBm |
| 5G NR | -100 dBm to -120 dBm |
| GPS | -125 dBm to -135 dBm |
| Satellite Communication | -140 dBm or lower |
While the specific numbers vary by certifying authority and standard, the pattern is clear: the longer the distances a technology must cover, the more sensitive (more negative, on the logarithmic dBm scale) its receivers need to be. The figures are ranges rather than single values because many factors influence a given device’s sensitivity.
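To make that correlation concrete, a free-space link-budget sketch shows how range scales with sensitivity (a simplified Python illustration; it ignores antenna gains, fading, and obstructions, and the transmit power and frequency used in the example are assumptions):

```python
import math

def free_space_range_km(tx_power_dbm: float, sensitivity_dbm: float, freq_mhz: float) -> float:
    """Maximum free-space range for a given receiver sensitivity.

    Uses the free-space path loss model:
        FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    and solves for the distance at which the received power equals the
    sensitivity threshold. Antenna gains and link margins are omitted.
    """
    allowed_path_loss_db = tx_power_dbm - sensitivity_dbm
    return 10 ** ((allowed_path_loss_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

# Example: a 20 dBm transmitter at 2,437 MHz (Wi-Fi channel 6)
for sens in (-75, -85, -95):
    print(f"sensitivity {sens} dBm -> ~{free_space_range_km(20, sens, 2437):.1f} km line-of-sight")
```

Under these assumptions, each additional 6 dB of sensitivity roughly doubles the achievable line-of-sight range.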
Factors Affecting RF Receiver Sensitivity
Several factors impact a receiver’s sensitivity:
- Thermal Noise – All electronic components generate noise, which limits sensitivity.
- Interference – Other RF signals can degrade the receiver’s ability to detect weak signals.
- Receiver Noise Figure (NF) – A lower noise figure improves sensitivity (see the worked example after this list).
- Antenna Quality – A poorly designed antenna can reduce signal strength before it reaches the receiver.
- Modulation Scheme – Some modulation techniques require a higher SNR, impacting sensitivity.
- Filtering and Amplification – Proper filtering and low-noise amplification help improve signal reception.
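The first three factors can be folded into a quick estimate. Below is a minimal Python sketch (the per-modulation SNR figures are illustrative assumptions, not regulatory values) showing how the noise figure and the modulation’s SNR requirement move the sensitivity floor:

```python
import math

THERMAL_NOISE_DBM_PER_HZ = -174.0  # thermal noise density at roughly 290 K

def estimated_sensitivity_dbm(bandwidth_hz: float, noise_figure_db: float, required_snr_db: float) -> float:
    """Best-case sensitivity: thermal noise in the channel bandwidth,
    degraded by the receiver's noise figure, plus the SNR the
    modulation needs to hold the target error rate."""
    noise_floor = THERMAL_NOISE_DBM_PER_HZ + 10 * math.log10(bandwidth_hz)
    return noise_floor + noise_figure_db + required_snr_db

# Representative (assumed) SNR requirements for a 20 MHz channel
for modulation, snr_db in [("BPSK 1/2", 4), ("16-QAM 3/4", 17), ("64-QAM 5/6", 25)]:
    print(modulation, round(estimated_sensitivity_dbm(20e6, 6, snr_db), 1), "dBm")
```

Running this for a 6 dB noise figure on a 20 MHz channel shows the simpler modulation tolerating signals roughly 20 dB weaker than dense QAM, the same trade-off behind the ranges in the table above.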
RF Receiver Sensitivity Testing for FCC, ISED (IC), and ETSI Compliance
Testing RF receiver sensitivity is crucial for compliance with FCC (U.S.), ISED (Canada), and ETSI (Europe) regulations to ensure that devices can maintain reliable communication at low signal levels.
FCC Receiver Sensitivity Testing
Regulatory Standards: FCC Part 15, 22, 24, 27
Test Procedure
| Test Phase | Details |
| --- | --- |
| Test Setup | A signal generator transmits a modulated RF signal to the Device Under Test (DUT). A variable attenuator gradually reduces the signal strength. A bit error rate tester (BERT) or packet error rate (PER) tester records received data. A spectrum analyzer monitors unwanted interference. |
| Test Execution | The input signal strength is decreased until the Bit Error Rate (BER) or Packet Error Rate (PER) exceeds an allowable threshold. The minimum signal level at which the DUT maintains BER ≤ 0.1% or PER ≤ 1% is recorded as the receiver sensitivity (see the sketch after this table). |
| Pass/Fail Criteria | Sensitivity must meet minimum receiver sensitivity limits based on the technology: Wi-Fi (802.11ac): -82 dBm for 64-QAM; LTE (FCC Part 27): -101 dBm for Band 1; Bluetooth (FCC Part 15.247): -70 dBm. |
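The execution step above is essentially a threshold search. A minimal sketch of that loop in Python (the instrument-control callables are placeholders; actual labs drive calibrated generators and PER testers through automation software):

```python
def find_sensitivity_dbm(set_level_dbm, measure_per,
                         start_dbm=-60.0, stop_dbm=-110.0,
                         step_db=1.0, per_limit=0.01):
    """Lower the signal in fixed steps until PER exceeds the limit and
    report the last passing level as the measured sensitivity.
    `set_level_dbm` and `measure_per` stand in for instrument-control calls."""
    last_pass = None
    level = start_dbm
    while level >= stop_dbm:
        set_level_dbm(level)
        per = measure_per()        # fraction of packets lost at this level
        if per > per_limit:
            break                  # first failing level: stop the sweep
        last_pass = level
        level -= step_db
    return last_pass               # None means no level passed
```

A BER-based search works the same way, with the 0.1% limit substituted for the 1% PER limit.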
ISED (IC) Receiver Sensitivity Testing (Canada)
Regulatory Standards: RSS-Gen, RSS-132, RSS-133
Test Procedure
| Test Phase | Details |
| --- | --- |
| Test Setup | Identical to FCC testing, using a signal generator, BER/PER tester, and spectrum analyzer. It may also include environmental interference simulation. |
| Test Execution | The DUT is tested across all of its operating frequencies, and the minimum sensitivity threshold is measured at a BER of ≤ 0.1% (see the sketch after this table). |
| Pass/Fail Criteria | Sensitivity must meet ISED minimum requirements, typically equal to FCC limits. For example, devices using Wi-Fi (RSS-247) would need a minimum sensitivity rating of -82 dBm and LTE devices would need a minimum sensitivity rating of -101 dBm. |
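Because ISED testing spans every operating channel, the per-channel search is usually wrapped in a sweep that reports the worst case. A minimal sketch (the channel list and measurement routine are placeholders):

```python
def worst_case_sensitivity(channels_mhz, measure_sensitivity_dbm):
    """Run the sensitivity search on each operating channel and return the
    worst (highest, least sensitive) result, which is the figure compared
    against the regulatory limit. `measure_sensitivity_dbm` stands in for
    the per-channel search routine."""
    results = {ch: measure_sensitivity_dbm(ch) for ch in channels_mhz}
    worst_channel = max(results, key=results.get)
    return worst_channel, results[worst_channel], results
```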
ETSI Receiver Sensitivity Testing (Europe)
Regulatory Standards: EN 300 328, EN 301 908, EN 303 204
Test Procedure
| Test Phase | Details |
| --- | --- |
| Test Setup | Same as FCC/ISED, but with additional adjacent-channel interference sources. The protocol also uses narrower power step increments to determine precise sensitivity. |
| Test Execution | The DUT receives progressively weaker signals while being subjected to adjacent-channel interference. The receiver must still decode the signal with a BER ≤ 0.1% (see the sketch after this table). |
| Pass/Fail Criteria | More stringent than FCC/ISED, particularly in blocking tests. Example sensitivity minimums include: Wi-Fi (EN 300 328): -82 dBm; Wi-Fi (EN 301 893): -64 dBm; LTE (EN 301 908): -100 dBm; LoRaWAN (EN 303 204): -137 dBm. |
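Because the ETSI procedure layers an adjacent-channel interferer on top of the basic search, it is natural to express the result as the sensitivity lost when the blocker is enabled. A minimal sketch (all callables are placeholders for instrument control):

```python
def adjacent_channel_degradation_db(measure_sensitivity_dbm, set_interferer,
                                    offset_mhz, interferer_level_dbm):
    """Measure clean sensitivity, enable an adjacent-channel interferer,
    re-measure, and report how many dB of sensitivity were lost."""
    set_interferer(enabled=False)
    clean = measure_sensitivity_dbm()
    set_interferer(enabled=True, offset_mhz=offset_mhz,
                   level_dbm=interferer_level_dbm)
    with_blocker = measure_sensitivity_dbm()
    return with_blocker - clean    # positive value = sensitivity lost to the blocker
```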
Ensuring Accurate RF Receiver Sensitivity Testing: Best Practices of a Leading RF Testing Lab
Since receiver sensitivity directly impacts a device’s ability to operate in low-signal environments, a testing lab’s setup must account for signal purity, interference mitigation, environmental stability, and instrumentation accuracy. Below are some best practices that leading labs use to perform sensitivity testing.
Establishing a Controlled RF Testing Environment
To obtain reliable receiver sensitivity data, an RF testing lab must eliminate variables that could distort signal measurements. This is achieved through:
Anechoic Chambers
RF receiver sensitivity tests are conducted in an anechoic chamber, which is lined with radio wave-absorbing material to prevent signal reflections. The chamber isolates the Device Under Test (DUT) from external RF signals, ensuring that measured performance is solely due to the test conditions.
Temperature and Humidity Control
Temperature fluctuations can affect receiver noise figures and RF front-end performance, leading to inconsistent results. High-precision climate control maintains a stable test environment, particularly for high-frequency bands like mmWave (24 GHz+), where performance is temperature-sensitive. Humidity control is crucial for reducing RF signal absorption in the air, particularly in sub-6 GHz bands used in Wi-Fi, LTE, and IoT networks.
Utilizing High-Precision Test Equipment
Testing labs rely on calibrated, high-precision instrumentation to generate test signals and measure receiver sensitivity. Essential components include:
Signal Generator with Low Phase Noise
A high-stability RF signal generator transmits a modulated test signal at varying power levels. The generator must have an ultra-low phase noise profile to prevent artificial errors in sensitivity measurements. Advanced test labs use vector signal generators (VSGs) capable of generating signals with real-world impairments (e.g., multipath fading, interference).
Variable RF Attenuator
An automated variable attenuator reduces the input signal in controlled steps, allowing precise determination of the receiver’s lowest operational threshold. Top-tier labs employ programmable step attenuators that allow fine adjustments in 0.1 dB increments, ensuring highly granular measurements.
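Those fine steps are what bound the measurement uncertainty: a coarse sweep brackets the threshold, and a 0.1 dB refinement pins it down. A minimal sketch of that refinement (the pass/fail callable is a placeholder):

```python
def refine_threshold_dbm(passes_at, coarse_pass_dbm, coarse_fail_dbm,
                         resolution_db=0.1):
    """Binary-search between the last passing and first failing coarse levels
    until the bracket is narrower than the attenuator's step resolution.
    `passes_at(level)` is a placeholder returning True when BER/PER is in spec."""
    low, high = coarse_fail_dbm, coarse_pass_dbm   # low is the weaker (more negative) bound
    while high - low > resolution_db:
        mid = (low + high) / 2.0
        if passes_at(mid):
            high = mid      # still passing: move the passing bound weaker
        else:
            low = mid       # failing: back off
    return high             # weakest level known to pass, to within 0.1 dB
```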
Bit Error Rate (BER) or Packet Error Rate (PER) Tester
A BER tester (for digital signals) or PER tester (for packet-based communication protocols like Wi-Fi and Bluetooth) determines whether the receiver meets the minimum sensitivity threshold. The lab measures BER at progressively weaker signal levels, stopping when the receiver reaches an error rate of 0.1% (industry standard threshold).
Spectrum Analyzer with Pre-Selectors
A high-resolution spectrum analyzer detects unintended emissions and adjacent-channel interference that could corrupt test results. Pre-selectors and bandpass filters are used to isolate intended signals from spurious noise and harmonics.
Defining the RF Receiver Sensitivity Test Procedure
Once the environment and equipment are set up, the lab follows a standardized testing methodology based on industry regulations:
- Calibration and Baseline Measurements
All test instruments undergo pre-test calibration using a reference receiver with a known sensitivity benchmark. The device under test is configured in its normal operational mode, with software adjustments disabled to avoid artificial enhancements.
- Test Execution
The signal generator transmits a known modulated signal at the receiver’s operating frequency. The attenuator reduces the signal strength in precise increments until the receiver fails to maintain BER ≤ 0.1% or PER ≤ 1%. The minimum detectable power level is recorded as the receiver sensitivity threshold.
- Adjacent Channel Interference Simulation
To ensure compliance with ETSI and 3GPP standards, the lab introduces adjacent-channel interferers to simulate real-world network congestion. The receiver must maintain functional operation while rejecting interference from nearby frequency bands.
- Statistical Averaging and Verification
Sensitivity results are averaged over multiple test cycles to remove statistical anomalies. The test is repeated across temperature and humidity ranges to ensure environmental resilience.
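A minimal sketch of the averaging step (the per-cycle measurement is a placeholder; real reports record the spread as well as the mean):

```python
from statistics import mean, stdev

def averaged_sensitivity_dbm(measure_once_dbm, runs=10):
    """Repeat the full sensitivity measurement and report the mean and
    standard deviation so a single outlier cycle cannot skew the result.
    `measure_once_dbm` stands in for one complete test cycle."""
    samples = [measure_once_dbm() for _ in range(runs)]
    return {"mean_dbm": mean(samples), "std_db": stdev(samples), "runs": runs}
```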
Compliance Verification and Final Reporting
Once testing is complete, the lab compiles a comprehensive report including:
- Receiver Sensitivity Measurement: The lowest power level at which the receiver maintains stable communication.
- Error Rate Data: BER/PER graphs showing how sensitivity degrades as signal strength decreases.
- Interference Resilience Tests: Adjacent-channel rejection levels and blocking performance metrics.
- Compliance Status: Whether the receiver meets FCC, ETSI, ISED, or 3GPP regulatory requirements.
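For illustration only, the report contents listed above might be captured in a structure like the following (the field names are ours, not a regulatory format):

```python
from dataclasses import dataclass

@dataclass
class SensitivityReport:
    sensitivity_dbm: float                        # lowest level with stable communication
    error_rate_curve: list[tuple[float, float]]   # (signal level in dBm, BER or PER)
    adjacent_channel_rejection_db: float          # interference resilience result
    limit_dbm: float                              # applicable FCC/ISED/ETSI limit

    @property
    def compliant(self) -> bool:
        # Lower (more negative) than the limit means the receiver passes
        return self.sensitivity_dbm <= self.limit_dbm
```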
Partner with the Leading RF Sensitivity Testing Lab
MiCOM Labs offers more than 20 years of experience with RF testing expertise best described as “inch wide, mile deep”. Along with the ability to provide certification testing for numerous target markets, MiCOM offers the MiTest® automated testing and report generation platform, giving quick and easy access to results and reports from any point on the globe. With these and other innovations, this family-driven, independent lab is bringing RF testing into the 21st century. To learn more about RF sensitivity testing and its role in global device certification, get in touch with the team at MiCOM Labs. Use the contact form to start the conversation, or call us directly at +1 (925) 462-0304.