What represents the best signal quality: RSSI, SNR, or a combination of both?

This question relates to the following TTS issue:
Live data - must show best SNR/RSSI pair instead of only values from first gateway #4846

Currently, the RSSI and SNR values shown in TTS Console Live data are taken from the first gateway reported in the list.

The RSSI and SNR values are therefore not representative of the quality of the received signal (assuming that the message was received by multiple gateways).

As suggested in issue #4846, the RSSI and SNR values shown in Live data should be the best RSSI/SNR pair from all gateways that received the message.

How would ‘best signal’ be determined from RSSI/SNR values from different gateways?
Which pair is the most representative of the ‘best signal’ quality?

Depends on what you mean by ‘quality’.

If you want to know how close reception is to failure, don’t bother with RSSI, just read the SNR.

What I mean by quality boils down to: what is the best signal received?

When multiple gateways have received the message but RSSI/SNR values are displayed from one single gateway only, from which gateway should those values be shown in the Console Live data? Which one represents the best signal?

RSSI, on its own, tells you zilch about how close a received signal is to failure.

Bottom line: for reception at a gateway, do you even care? If the signal is received and the message is OK (passes MIC checks, errors recovered under FEC, etc.), it will get passed on, and the first to report is good enough. If it fails, then that gateway won’t report and the next one will! For any given SF there is a minimum required SNR for reliable signal extraction, and as might be expected given the modulation’s resilience, the higher the SF the lower the SNR can be.
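
For reference, here is a minimal Python sketch of that SF-vs-SNR relationship, using the commonly cited Semtech demodulation-floor figures for 125 kHz channels. These are approximate nominal values, not anything pulled from the TTS source.

```python
# Commonly cited LoRa demodulation floors (approximate Semtech figures),
# assuming the standard 125 kHz bandwidth used by most uplink channels.
SNR_FLOOR_DB = {
    7: -7.5,
    8: -10.0,
    9: -12.5,
    10: -15.0,
    11: -17.5,
    12: -20.0,
}

def link_margin_db(snr_db: float, spreading_factor: int) -> float:
    """How far the reported SNR sits above the demodulation floor for this SF.

    A margin near 0 dB means reception is close to failure, regardless of
    how strong the RSSI looks.
    """
    return snr_db - SNR_FLOOR_DB[spreading_factor]

# Example: an SF12 uplink reported at SNR -17 dB still has ~3 dB in hand,
# while the same SNR at SF7 would not have demodulated at all.
print(link_margin_db(-17.0, 12))  # 3.0
print(link_margin_db(-17.0, 7))   # -9.5
```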

When it comes to NS comms back to a node, the NS has the reception metadata (including relative arrival times, i.e. speed-of-light differences between GWs) for all the valid received copies passed on by the various GWs, and can select the optimum one to route the downlink back through.

TL;DR: which GW gets the message first is moot, as either it gets through or it doesn’t… so why should the back end spend time sorting/adjusting info for display, when it will have already reported the message metadata on the Console before it needs to schedule the downlink (which is the point where ‘quality’ will need to be known)?

Please stick to my original question.

I hope someone can give a substantive answer.

@Jeff-UK Try the other way around: why/when would showing RSSI and SNR be useful and valuable?

When you see several/many signals close to the reception limit, then you know you are in poor reception conditions or a marginal reception area, and it is time to densify the network, improve the gateway position, improve the node antenna/position, or several of the above. This comes in useful when doing e.g. GPS tracking to determine overall area coverage and signal mapping. You may see a strong signal in many areas, but by heatmapping you can see where you start to go into poorer areas, and indeed areas of no coverage are obvious - no signal! - but where does that become a risk?

Mechanisms such as ADR have a network-determined ‘safety margin’ built into the algorithm, beyond/below which there is considered to be an increased risk of packet loss. IIRC the general recommendation is to aim for at least a 6-9 dB margin, but I have seen some deployments where the use case allowed closer to 3 dB as it was tolerant to loss for sustained periods, and others where they wanted ‘reliable’ comms and set >15 dB. Again IIRC, I think TTN is set quite high, somewhere around 10-15 dB; no doubt someone familiar with the source code will have looked, or can look and confirm if no one from TTI comments.

Having SNR heatmapped is useful as it may highlight areas with known strong interferers that can either be avoided or mitigated against - again, in one case I saw a user drop in a spare TTIG to give infill coverage that helped mitigate a noisy (industrial) neighbour! Obviously, odd spurious noise spikes are difficult to mitigate against, but where there is a consistently high noise floor this can work quite well :slight_smile:
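
If you want to turn that safety-margin idea into heatmap buckets, a rough sketch could look like the following. The thresholds are only the rules of thumb mentioned above, not values ADR or TTN actually use, and the margin is computed as in the earlier `link_margin_db` sketch.

```python
def classify_coverage(margin_db: float, target_margin_db: float = 6.0) -> str:
    """Bucket one reading for a coverage heatmap.

    margin_db is the SNR minus the demodulation floor for the SF in use.
    target_margin_db is the installation margin you are designing for: the
    6 dB default follows the 6-9 dB rule of thumb, a loss-tolerant use case
    might accept ~3 dB, a 'reliable' deployment might demand >15 dB.
    """
    if margin_db >= target_margin_db:
        return "good"
    if margin_db >= 0.0:
        return "marginal"    # decoded, but inside the safety margin
    return "no coverage"     # below the nominal floor: uplinks simply stop arriving

# Example: an SF12 uplink at SNR -17 dB has a ~3 dB margin.
print(classify_coverage(3.0))        # marginal
print(classify_coverage(3.0, 3.0))   # good (loss-tolerant deployment)
```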

If the first listed receiving GW looks marginal, then it is worth digging into some of the reception metadata and asking the question: OK, if Gateway X - the first to show - deteriorates or goes offline, should I worry, or can I be confident that there are others in the reception list that will catch anything GW X misses? Indeed, one of those may even show a stronger signal - and most likely that would be the one chosen for downlinks anyhow vs GW X, as its signal is stronger even if it is not first.

To that end, as GWs proliferate, I have also seen situations where for months (years!) I always saw a great signal from a couple of nodes, then suddenly the signal levels fell (I could see this when looking historically at graphs of RSSI & SNR in Cayenne). It was not an issue, as messages still got through and there was no increase in packet loss… it turned out someone had deployed an indoor GW around half way between the nodes and my GW, and it was now the first to show reception in the Console! My nice expensive 20 m telecoms-mast-mounted outdoor GW was still in the reception list, but it was the indoor GW that was handling the uplink first (most of the time)… interestingly, 9/10 downlinks still went via my GW rather than the indoor one (10-18 dB higher RSSI and 3-5 dB better SNR) :wink:

Did you page me??

It’s actually a console configurable item now …

I do that with an ML model because I’m the geekiest geek in the geekverse.

Not directly - I know you, Chris and many other forumites are into the source code - I’ve never looked… ’cause… :wink:

Yep, that can work - indeed, if all GWs show a significant drop at the same time and for a sustained period, that’s a red flag that there may be something wrong with the node or something has changed in the local environment! (A classic is the builder’s skip dropped on top of the water meter in the drive :wink:)
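
A crude way to automate that red-flag check, assuming you already log per-gateway RSSI or SNR history somewhere (the window size and drop threshold below are made up for illustration and would need tuning per deployment):

```python
from statistics import mean

def sustained_drop(history_db: list[float], window: int = 20,
                   drop_threshold_db: float = 6.0) -> bool:
    """Flag a sustained fall in one gateway's RSSI or SNR series.

    Compares the mean of the newest `window` readings against the mean of
    the `window` readings before that.
    """
    if len(history_db) < 2 * window:
        return False
    recent = mean(history_db[-window:])
    baseline = mean(history_db[-2 * window:-window])
    return (baseline - recent) >= drop_threshold_db

def node_red_flag(per_gateway_history: dict[str, list[float]]) -> bool:
    """Red flag when *every* gateway hearing the node shows the same sustained
    drop - that points at the node or its local environment, not one gateway."""
    return bool(per_gateway_history) and all(
        sustained_drop(series) for series in per_gateway_history.values()
    )
```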

The model, sadly, needs some devices to disappear before it knows stuff - it’s not best suited to a sudden random change like a skip; it’s more about identifying devices that are at risk if a gateway in a device’s area goes down or disappears, potentially compromising the overall reception.

SNR.

To understand the difference you have to bring in interference, receiver phase noise, spurious emissions and even same-channel signals.

In all these cases RSSI would be good but SNR would be low.

Even for a compressed (overloaded) receiver input with very high RSSI, the SNR would be low.

The strength of the signal never says how good the contents are. But yes, the higher the RSSI in a well-modulated, clean reception environment, the better the receive quality.
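
Putting the thread’s answer together: if ‘best’ means SNR-first with RSSI as the tie-break, then picking the pair to display from a The Things Stack v3 uplink’s rx_metadata could be as simple as the sketch below. The `rssi` and `snr` field names are as they appear in the v3 uplink JSON, the gateway IDs are made up, and this selection rule is just one reasonable reading of ‘best’, not the Console’s actual implementation.

```python
def best_reception(rx_metadata: list[dict]) -> dict:
    """Return the per-gateway entry with the best SNR, using RSSI to break ties."""
    return max(rx_metadata, key=lambda gw: (gw["snr"], gw["rssi"]))

# Mirroring the anecdote above: the indoor gateway is listed first, but the
# mast-mounted one has the better SNR/RSSI pair and is the one worth showing.
uplink_rx = [
    {"gateway_ids": {"gateway_id": "indoor-gw"}, "rssi": -95, "snr": 4.5},
    {"gateway_ids": {"gateway_id": "mast-gw"},   "rssi": -80, "snr": 8.0},
]
print(best_reception(uplink_rx)["gateway_ids"]["gateway_id"])  # mast-gw
```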