RSSI values showing the received signal strength of the gateway from the node?

The first generation of Single Channel Packet Forwarders (mostly DIY stuff) do not have downlink capabilities. ESP based versions allow the owner to ‘fix’ the downlink frequency by ignoring the backend instructions.
So while uplink and downlink capabilities are theoretically separate, in real life when the uplink is compromised there is a realistic chance the downlink might not conform to the standard either.


As I pointed out before, if downlink did not actually work (and I am confident that, perhaps unlike some early DIY experiments, the LG01 and LG02 downlink does), then these devices wouldn’t have much ability to disrupt the network.

To disrupt the network they need to mislead ADR by turning in a stronger signal report for a given uplink than any “true” multichannel gateway does. If they turn in the strongest report, they’re overwhelmingly likely to be selected for any downlink reply; if they then fail to deliver that downlink, the misleading ADR ack won’t be delivered and the node won’t be misled. Given they’d only be reporting a minority of a real node’s packets, they’re not likely to skew a time average much either.
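To make that mechanism concrete, here is a minimal sketch (Python, with hypothetical field and function names, not TTN’s actual code) of how a network server commonly picks the downlink gateway for an uplink heard by several gateways: the strongest-looking report wins, so a gateway that turns in inflated metadata also gets handed the downlink.

```python
# Minimal sketch (hypothetical data shapes, not TTN's actual code) of how a
# network server commonly chooses which gateway transmits the downlink reply:
# the gateway that reported the best signal for the uplink is picked, so a
# gateway that reports an inflated RSSI/SNR also "wins" the downlink.

from dataclasses import dataclass

@dataclass
class UplinkReport:
    gateway_id: str
    rssi_dbm: float   # received signal strength reported by the gateway
    snr_db: float     # signal-to-noise ratio reported by the gateway

def pick_downlink_gateway(reports: list[UplinkReport]) -> UplinkReport:
    # Prefer the strongest report; SNR breaks ties between similar RSSI values.
    return max(reports, key=lambda r: (r.rssi_dbm, r.snr_db))

reports = [
    UplinkReport("real-8ch-gateway", rssi_dbm=-117.0, snr_db=-7.5),
    UplinkReport("single-channel-fakeway", rssi_dbm=-95.0, snr_db=6.0),
]
print(pick_downlink_gateway(reports).gateway_id)  # -> "single-channel-fakeway"
```

If that “winning” gateway then never transmits, the ADR/MAC reply is simply lost, which is why a fakeway with broken downlink tends to black-hole replies rather than actually skew the node’s settings.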

As I’ve repeatedly pointed out, these do not belong on the current form of TTN - but if TTN were made safe from them, downlink would not be the problem, because it actually does work on most of them. And where it doesn’t, unlike supporting LoRaWAN-compliant uplink, which is a hardware problem, downlink is a software problem that is economically fixable. That said, the more I think about a possible “TTN junior”, the more I think it would tag (and likely even detect) both gateways and nodes and use them only to interact with each other.

I made no mention of switching channels to account for dwell time; LoRaWAN doesn’t even give a network such an option, since apart from the choice of RX1 vs RX2 it’s the node, not the network, which chooses the frequency.

What I did explain is that for all types of gateways the downlink channel setting is arbitrary, since the radio is configured at the time of use. A fakeway needing to toggle between uplink and downlink settings is flipping the IQ setting even if it’s in Europe and using the same frequency for uplink and downlink. One in other regions would be toggling between uplink and uplink-mod-8 downlink channels (even if only ever one of each), and also changing bandwidth.
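As a rough illustration of what that reconfiguration involves, the sketch below derives RX1 downlink radio settings from the uplink settings for two regions. It is simplified from the LoRaWAN regional parameters (the frequencies and channel arithmetic are my reading of the spec, so check it for your region): same frequency but inverted IQ in Europe, and uplink-channel-mod-8 on a separate 500 kHz downlink band in the US.

```python
# Rough sketch of deriving RX1 downlink radio settings from an uplink,
# simplified from the LoRaWAN regional parameters. Values here are a
# reading of the spec, not taken from any particular gateway's firmware.

def rx1_settings(region: str, uplink_freq_hz: int, uplink_bw_hz: int) -> dict:
    if region == "EU868":
        # RX1 uses the same frequency and bandwidth as the uplink,
        # but the downlink is transmitted with inverted IQ.
        return {"freq_hz": uplink_freq_hz, "bw_hz": uplink_bw_hz, "invert_iq": True}
    if region == "US915":
        # 125 kHz uplink channels: 902.3 MHz + 200 kHz * n (n = 0..63).
        # RX1 downlink channel = uplink channel mod 8, on
        # 923.3 MHz + 600 kHz * (n mod 8), always 500 kHz wide, inverted IQ.
        ch = round((uplink_freq_hz - 902_300_000) / 200_000)
        return {
            "freq_hz": 923_300_000 + 600_000 * (ch % 8),
            "bw_hz": 500_000,
            "invert_iq": True,
        }
    raise ValueError(f"region {region!r} not covered in this sketch")

print(rx1_settings("EU868", 868_100_000, 125_000))   # same frequency, IQ inverted
print(rx1_settings("US915", 902_300_000, 125_000))   # channel 0 -> 923.3 MHz, 500 kHz
```

Either way, the radio has to be reprogrammed between receiving and transmitting, which is exactly why the downlink parameters are a software matter rather than a hardware limitation.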

To address one part of the OP question that was not covered (much) in the discussion (note that everything else is critical if you are setting up a gateway), RSSI in RF systems is a highly variable measurement. While the actual measurement of RF power is a repeatable process, the environment, particularly in multi-path fading situations, very strongly affects the actual reading, so don’t expect your RSSI values to be highly consistent as the environment changes.
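As a purely illustrative sketch of what that variability looks like (made-up numbers, not a measurement recipe): a handful of readings from the same node can easily span 10 dB or more, and if you do want a single summary figure it is generally better to average the underlying power in milliwatts rather than the dB values themselves.

```python
# Illustrative only: RSSI readings of the same link can easily vary by 10 dB
# or more under multipath fading. If a single summary figure is wanted,
# average in linear power (mW), not in dB, then convert back.

import math

rssi_dbm = [-107, -112, -99, -118, -104]          # made-up readings from one node

spread_db = max(rssi_dbm) - min(rssi_dbm)         # 19 dB spread in this example
linear_mw = [10 ** (v / 10) for v in rssi_dbm]    # dBm -> mW
avg_dbm = 10 * math.log10(sum(linear_mw) / len(linear_mw))

print(f"spread: {spread_db} dB, linear-average RSSI: {avg_dbm:.1f} dBm")
```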

I’ve designed RF radio modules in previous jobs and found, when building the factory test automation systems, that the only way to effectively measure the receive sensitivity of an RF receive chain, particularly when a quality LNA (low-noise amplifier) is involved, is a cabled connection to the RF signal generator. Any attempt to couple the signal through an antenna, without taking extreme measures to create a heavily shielded environment with consistent signal absorption, resulted in highly variable measurements (often more than 10 dB of difference).
