Impact of RSSI offset on AGC behavior in a LoRa gateway

Hi everyone,

I’m trying to better understand the impact of RSSI offset on the RF behavior of a LoRa gateway.

My setup:

  • Gateway based on Semtech SX1301 concentrator with SX1257 RF front-end

  • Reception is handled by the gateway

  • Transmission is done by a LoRa node

My questions are mainly about the RSSI offset parameter (default value: -166):

  1. Does the RSSI offset affect the AGC behavior?
    In other words, can an incorrect RSSI offset lead to a wrong gain adjustment in the RF front-end?

  2. Or is the RSSI offset only used as a post-processing correction, meaning it just adjusts the displayed RSSI value without impacting the internal analog chain (LNA/AGC)?

  3. Is the default value (-166) valid for all setups, or should it be calibrated depending on the hardware (front-end, losses, antenna, etc.)?

  4. If calibration is required, what is the recommended method to determine an accurate RSSI offset?

My concern is the following:
If RSSI offset is involved in AGC decisions, then a wrong value could potentially lead to incorrect gain settings and degraded sensitivity.

Thanks in advance for your insights!

Good questions — these come up a lot when people first dig into the SX1301 internals.

**Short answer: RSSI offset is purely a post-processing correction. It does NOT affect the AGC.**

Here’s the breakdown:

**1. AGC operates on the raw RF signal, not the RSSI offset**

The SX1301’s AGC loop runs in firmware on a small MCU embedded in the chip (the AGC firmware image the HAL loads at startup) and controls the LNA/mixer gain stages in the SX1257 front-end. It uses the raw received power level at the ADC input to make gain decisions — the `rssi_offset` parameter in `global_conf.json` is never fed back into this loop. So your concern about an incorrect RSSI offset degrading sensitivity or causing wrong gain settings is unfounded at the hardware level.

**2. What RSSI offset actually does**

It’s a constant added in software (applied per radio by the libloragw HAL, configured through the packet forwarder’s `global_conf.json`). It maps the SX1301’s raw RSSI reading to absolute dBm at the antenna port, which is why its magnitude is so large, and it also absorbs the total insertion loss in the signal path:

- Antenna cable loss

- PCB trace loss

- Filter insertion loss

- Any LNA or attenuator in the chain before the SX1257 RF input

The formula is essentially: `RSSI_reported = RSSI_measured_by_SX1301 + rssi_offset`
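As a quick sanity check of that formula (the raw value below is made up for illustration, not a real register reading):

```python
raw_rssi = 130.0       # hypothetical raw RSSI value read from the SX1301
rssi_offset = -166.0   # default offset from global_conf.json
reported_rssi = raw_rssi + rssi_offset
print(reported_rssi)   # -36.0 dBm
```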

The default value of `-166` (note it’s an offset in dB, not a power in dBm) is Semtech’s calibration figure for the SX1301 reference design with the SX1257 at nominal gain. If your hardware closely matches the reference BOM, it works fine.
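For reference, the offset lives in the per-radio section of `global_conf.json`. This fragment follows the layout of Semtech’s EU868 reference config (the frequency shown is just an example; your band plan will differ):

```json
"SX1301_conf": {
    "radio_0": {
        "enable": true,
        "type": "SX1257",
        "freq": 867500000,
        "rssi_offset": -166.0,
        "tx_enable": true
    }
}
```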

**3. When to calibrate**

You’ll need to tweak it if:

- You have external RF components (extra LNA, bandpass filter, attenuator)

- Non-reference PCB layout with different trace lengths

- Different RF front-end (some designs use SX1255 instead of SX1257)

- You’re seeing consistent RSSI discrepancy vs. a calibrated reference

**Calibration method:**

Use a LoRa node with a known TX power (e.g., +14 dBm), either at a known distance (e.g., 1 meter in an anechoic environment) or, cleanest of all, connected through a calibrated RF attenuator directly to the RF port. Compute the expected power at the port (TX power minus free-space path loss or attenuator value), then adjust `rssi_offset` until the reported RSSI matches it. For the conducted approach, use enough attenuation to keep the input in a normal receive range: +14 dBm through only a 10-20 dB pad is still -6 to +4 dBm at the port, which will overload the front-end, so stack pads (50 dB or more total) for a realistic level.
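The adjustment itself is simple arithmetic. Here’s a minimal sketch (the function names are mine, not part of any Semtech tool), assuming a conducted test through a calibrated attenuator:

```python
def expected_rssi(tx_power_dbm: float, attenuation_db: float) -> float:
    """Power expected at the gateway RF port for a conducted test."""
    return tx_power_dbm - attenuation_db

def corrected_offset(current_offset: float, reported_rssi: float,
                     expected: float) -> float:
    """New rssi_offset that would make the reported RSSI match expectation."""
    return current_offset + (expected - reported_rssi)

# Example: node at +14 dBm through a 50 dB attenuator -> expect -36 dBm.
expected = expected_rssi(14.0, 50.0)
# Suppose the gateway reports -40 dBm with the default offset of -166:
new_offset = corrected_offset(-166.0, -40.0, expected)
print(new_offset)   # -162.0
```

Since the offset is a pure additive term, the correction is just the difference between expected and reported RSSI, added to the current offset.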

For most TTN community gateway deployments the default value is close enough — RSSI accuracy of ±3-5 dB is typical and doesn’t meaningfully impact network coverage decisions. The main place it matters is if you’re doing RSSI-based localization or precise link budget analysis.

Hope that clears it up!