Rssi vs channel_rssi - what is the difference?

As seen here:

        "gateway_ids": {
          "gateway_id": "descartes-public-ttig",
          "eui": "58A0CBFFFE802BBE"
        },
        "time": "2022-02-03T13:25:47.478770017Z",
        "timestamp": 3484061372,
        "rssi": -60,            <-------
        "channel_rssi": -60,    <-------
        "snr": 8.75,
        "location": {
          "latitude": 53.57031588860688,
          "longitude": -2.460586780216545,
          "altitude": 50,
          "source": "SOURCE_REGISTRY"
        }

Before I ask a grownup, does anyone know what the difference might be?

The difference between -60 and -60 ?

0 ?


You know what I mean!

With SX1302 and newer baseband chips the packet_forwarder reports an additional RSSI value.
Please refer to the field definitions in the packet forwarder's protocol file:

 Name |  Type  | Function
 rssi | number | RSSI of the channel in dBm (signed integer, 1 dB precision)
 rssis| number | RSSI of the signal in dBm (signed integer, 1 dB precision)
 lsnr | number | Lora SNR ratio in dB (signed float, 0.1 dB precision)

If I am not mistaken, the rssis value takes the SNR value into account.

If you are using an SX1301 as the baseband chip, only the rssi value is reported. I suppose that in this case the TTN JSON just copies the same value to both rssi and channel_rssi.


“channel_” :slight_smile:

Is that not the lsnr?

Yes, this is as I understand it - just double-checked around 5 messages across >15 GWs - a mix of 1301 & 1308 based (the 1308 is a 1301 derivative) and the value is replicated.

OK, but what does it mean in pragmatic terms? If it is different, or changes in a particular way, what does that mean, and when should we pay attention to it?

The channel RSSI measures the power of the whole radio input (signal + noise), while the signal RSSI measures the power of the signal alone. So you will typically see channel RSSI >= signal RSSI, and when the SNR is positive there should be no difference between the two (which is why you see -60 for both, since your SNR is 8 dB).
The difference is a kind of proxy for the SNR when the SNR is negative, but it is measured differently from the SNR, so when the noise is not white (a narrow-band interferer, for example) you might see that a negative measured SNR differs from the difference of the two RSSI values. So it could be something to look at when a link does not behave as expected in terms of performance.
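The relationship above can be sketched in a few lines. This is an illustrative sketch, assuming a gateway that reports a separate signal RSSI alongside the channel RSSI; the function name is hypothetical.

```python
# A minimal sketch of the relationship described above. The function name
# and calling convention are assumptions for illustration, not a real API.
def snr_proxy(channel_rssi, signal_rssi):
    """Difference of the two RSSI values: a rough proxy for negative SNR."""
    # Channel RSSI measures signal + noise, signal RSSI the signal alone,
    # so the difference is <= 0 and approximates the SNR when it is negative.
    return signal_rssi - channel_rssi

print(snr_proxy(-60, -60))     # 0: positive SNR, both values coincide
print(snr_proxy(-125, -135))   # -10: roughly the measured negative SNR
```

With a positive SNR the two values coincide, so the proxy carries no extra information, which matches the -60/-60 pair in the original question.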

In order to be sure I have checked with my colleagues the meaning of these values.


For the SX1301:

  • The rssi is measured once, after the LoRa header has been demodulated.
    The measured value is always above the noise floor.

For the SX1302:

  • The rssi is the averaged result of the RSSI measurements over the full packet reception.
    The measured value is always above the noise floor.
  • The rssis is also an averaged result of the RSSI measurements over the full packet reception. However, the measured SNR is also taken into account, following an equation similar to rssis = rssi + lsnr.
    Example: lsnr = -10 dB and rssi = -125 dBm => rssis = -125 + ( -10 ) = -135 dBm
    The measured value can be below the noise floor.
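The rssis relation stated above can be sketched directly (a one-line helper, using the equation and example figures from the post):

```python
# Sketch of the equation above: rssis is approximately the channel RSSI
# plus the measured LoRa SNR. The function name is hypothetical.
def rssis_from(rssi, lsnr):
    # With a negative SNR this drops below the noise floor, which is
    # exactly what the -135 dBm example above shows.
    return rssi + lsnr

print(rssis_from(-125, -10))  # -135
```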

@htdvisser, can you advise on the nomenclature as suggested by @mluis1?

I know rssi = channel_rssi as per the above, but will channel_rssi or rssi be calculated differently at some point?

Are the snr and lsnr the same value?

I see in some packets that although the snr is negative, the rssi and channel_rssi stay the same.

      "rssi": -117,
      "channel_rssi": -117,
      "snr": -14.5,

Which gateway chipset types is this with Johan? (SX1301/08/02/03?)

Not 100% sure, but I think it is a MikroTik wAP LoRa8; the gateway is "tech5-alverstone-02-dbn".

I also have rssi and channel_rssi when using a GW with SX1301. I think it might have something to do with the type of packet forwarder being used. I have previously tested multiple different GW models using SX1301 with both UDP packet forwarder and LBS, and back then I noticed some difference in the metadata. I also think when using UDP that you would get something called channel_index, which I did not get using LBS.

I have only 1 GW with SX1302 but I’m getting the same result as with SX1301 with LBS. I have not tested my SX1302 GW with the UDP packet forwarder.

The rssi and channel_rssi fields are exactly the same. The fact that they both exist is because of API evolution, and indeed because different gateway implementations (used to) measure these things slightly differently. Since we try to keep our API backwards compatible, we don't want one field to suddenly mean something else, we don't want to rename fields, and we don't want fields to disappear. So ideally we'd get three values from gateways: rssi (using the old SX1301 measurement method) plus rssic and rssis (using the SX1302 measurement method), but we don't live in an ideal world. Some gateways send rssic (translates to rssi_channel) and rssis (translates to rssi_signal). Other gateways send rssi (translates to rssi) and optionally rssis (translates to rssi_signal). In the first case, we copy rssi_channel to rssi, and in the second case the other way around. If this does not happen consistently, we should probably look into that.
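The copying behaviour described above can be sketched roughly as follows. This is a hedged sketch, not TTN's actual code: the function name is hypothetical, and the output keys mirror the JSON field names seen earlier in the thread.

```python
# Hypothetical sketch of the normalisation described in the post: whichever
# RSSI values the gateway's packet forwarder reported, populate both API
# fields so that rssi and channel_rssi are always present.
def normalise_rssi(reported):
    out = {}
    if "rssic" in reported:                     # SX1302-style report
        out["channel_rssi"] = reported["rssic"]
        out["rssi"] = reported["rssic"]         # copy channel RSSI into rssi
    elif "rssi" in reported:                    # SX1301-style report
        out["rssi"] = reported["rssi"]
        out["channel_rssi"] = reported["rssi"]  # copy the other way around
    if "rssis" in reported:                     # optional signal RSSI
        out["signal_rssi"] = reported["rssis"]
    return out

print(normalise_rssi({"rssic": -125, "rssis": -135}))
print(normalise_rssi({"rssi": -117}))
```

Either branch leaves rssi == channel_rssi in the output, which is why the two fields never differ in practice.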


Thanks for that, I can put notes in the code to explain why there are two and just use one.

I’ve not seen any uplinks where they are different, but I’ve not been doing much in that area of late. As I’m updating my webhook, it will be easy to query for that once I’ve got some readings in.
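The check mentioned above is a one-liner over stored metadata. A quick sketch, assuming the webhook stores each uplink's gateway metadata as a list of dicts (the storage format and function name are assumptions):

```python
# Scan stored uplink metadata for any reading where the two fields differ.
# The list-of-dicts shape is an assumption about the webhook's storage.
def rssi_mismatches(readings):
    return [r for r in readings if r.get("rssi") != r.get("channel_rssi")]

readings = [
    {"rssi": -60, "channel_rssi": -60, "snr": 8.75},
    {"rssi": -117, "channel_rssi": -117, "snr": -14.5},
]
print(rssi_mismatches(readings))  # []: matches the observation above
```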

I’m good - if I needed the totally definitive detail I’d parse the gateway logs.

The fact that seriously heavy rain or the refuse lorry can alter the RSSI for a short time means getting too obsessed with the numbers is a bit moot.