Q: ABP and the default channel usage (AS923, EU868) - why?

I’ve been on AU915 more than anything, building nodes. I’ve just thrown up an AS923 GW (in .au so totally legit) and relearnt something I learnt…last year. I don’t understand why this is so though.

It looks to be a significant limitation of ABP (reinforcing the “don’t use ABP” message) that the default channels (2 for AS923, and 3 for EU868) are the only ones used in a standards-compliant stack. This will surely cause congestion on those channels, which will hurt OTAA attempts too?

I get that OTAA delivers the MAC downlink that lists the permissible channels…but are they not always the same (for TTN), so why doesn’t a stack using ABP just choose some, à la the sub-bands in AU/US915, as a standard thing? Maybe it’s an AS1 vs AS2 thing for AS923? Or were the standards folk imagining the server assigning any of the available frequencies based on its knowledge of the utilisation?

It would help me remember this difference if I understood the rationale for this soft channel config in the standard. It would also encourage me to switch to OTAA (and the rationale should be included in the “don’t use ABP” advice everywhere too) :slight_smile:

(From other forum entries and code it looks like a frequent request/solution is just to force-set the frequencies, and TTN does use a fixed set. Perhaps that’ll evolve though?)

For ABP the node should be configured with all channels used by the LoRaWAN provider. As there is no standard mechanism for the network to provide the channel list, this is the responsibility of the owner of the node.
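For EU868 on TTN that means doing the channel setup by hand, as the LMIC ttn-abp.ino example does. A sketch of it (the calls and frequencies follow that example; with ABP the stack only knows the three default channels, so the rest must be added explicitly):

```cpp
// Manual channel setup for an ABP node on TTN EU868 (Arduino LMIC API).
LMIC_setupChannel(0, 868100000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI); // default
LMIC_setupChannel(1, 868300000, DR_RANGE_MAP(DR_SF12, DR_SF7B), BAND_CENTI); // default
LMIC_setupChannel(2, 868500000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI); // default
LMIC_setupChannel(3, 867100000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);
LMIC_setupChannel(4, 867300000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);
LMIC_setupChannel(5, 867500000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);
LMIC_setupChannel(6, 867700000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);
LMIC_setupChannel(7, 867900000, DR_RANGE_MAP(DR_SF12, DR_SF7),  BAND_CENTI);
LMIC_setupChannel(8, 868800000, DR_RANGE_MAP(DR_FSK,  DR_FSK),  BAND_MILLI); // FSK
```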

Where in AU/US915 the channels are defined in the specification, AS923 allows the operator to define all but the two mandatory channels as they see fit. As a result there are only two standard channels and the other 6 (or 14) are operator-specific. These channels are fixed across all gateways of an operator to allow devices to move between coverage areas. (Imagine the issues that would occur if different gateways from the same operator were listening on different frequencies. Almost as bad as the AU915/AS923 situation you are already experiencing.)
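So an AS923 ABP node can only hard-code the two mandatory frequencies with certainty; everything else has to come from the operator’s published plan. A sketch with the same LMIC API (only 923.2 and 923.4 MHz are fixed by the regional parameters; the others below are placeholders following TTN’s AS923-925 “AS2” plan and must be checked against your operator’s documentation):

```cpp
// The two channels mandated by the AS923 regional parameters:
LMIC_setupChannel(0, 923200000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(1, 923400000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
// The remaining channels are operator-defined; these values follow
// TTN's AS923-925 plan - verify against your operator's frequency plan.
// (The band argument only matters for EU-style duty-cycle bands;
// adjust per your LMIC variant.)
LMIC_setupChannel(2, 923600000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(3, 923800000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(4, 924000000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(5, 924200000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(6, 924400000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
LMIC_setupChannel(7, 924600000, DR_RANGE_MAP(DR_SF12, DR_SF7), BAND_CENTI);
```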

That’s usually a good idea, yes, especially as it would distribute the traffic a node attempts when out of range of any gateway.

Actually, there is. Every LinkADRReq transmitted from network to node includes a bitmap (ChMask) of allowed channels.
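For reference, the bitmap sits in the four payload bytes of LinkADRReq. A minimal sketch of unpacking them per the LoRaWAN 1.0.x spec (the field layout is from the spec; the function name is mine):

```cpp
#include <stdint.h>
#include <stdio.h>

// Unpack the 4-byte LinkADRReq payload (LoRaWAN 1.0.x, CID 0x03).
// Layout: DataRate_TXPower (1B) | ChMask (2B, little-endian) | Redundancy (1B).
void decodeLinkADRReq(const uint8_t p[4]) {
    uint8_t  dataRate   = p[0] >> 4;          // requested data-rate index
    uint8_t  txPower    = p[0] & 0x0F;        // requested TX-power index
    uint16_t chMask     = p[1] | (p[2] << 8); // bit n set => channel n enabled
    uint8_t  chMaskCntl = (p[3] >> 4) & 0x07; // how ChMask is applied; the meaning
                                              // is region-dependent (e.g. it selects
                                              // a block of 16 channels in US/AU915)
    uint8_t  nbTrans    = p[3] & 0x0F;        // requested transmission count

    printf("DR=%u TXPow=%u ChMaskCntl=%u NbTrans=%u ChMask=0x%04X\n",
           dataRate, txPower, chMaskCntl, nbTrans, chMask);
}
```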

If a node didn’t use ADR that might not apply, but not using ADR means needing to tune the SF to specific operating conditions or modes; i.e. you’re already customizing the node, so setting the channel plan should be a simple part of it.
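In Arduino LMIC terms that customization is only a couple of calls anyway (a sketch; SF9 at 14 dBm is an arbitrary example, and ttn-abp.ino already does the equivalent with SF7):

```cpp
LMIC_setAdrMode(0);           // no ADR: the network won't manage DR/channels for you
LMIC_setDrTxpow(DR_SF9, 14);  // so pick an SF/power suited to your conditions...
// ...and at that point adding the LMIC_setupChannel() calls for the
// operator's full channel plan is a small extra step.
```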

For that to work the frequencies of the channels must be predefined, which doesn’t work for AS923: only two frequencies are fixed and the operator is allowed to choose the rest.

Is it a correct assumption that the (MCCI) LMIC example ttn-abp.ino currently provides insufficient information (code and instructions) for use in all available regions?

Thanks everyone, I learnt a lot from that. I’m going to look up that bitmap in the LinkADRReq to see what it contains and how it’s useful.

My conclusion is that while it’s a #define to swap between OTAA and ABP on the popular stacks, there’s a lot more to it than that, and those subtleties aren’t discussed in many places. As you say @kersing, for ABP you’re responsible for configuring everything for the local environment. That’s a good baseline to start from, but knowing everything is hard; OTAA makes it easier, which is good motivation to use it.

@bluejedi There is some guidance, but it could be a little more blunt; better still, it should use the TTN-defined frequencies (like it does for EU868). The AS923 AS1 vs AS2 split will need another #define though.
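Something along these lines, say (a hypothetical sketch: the macro and array names are made up, and both frequency lists should be verified against TTN’s published AS920-923 and AS923-925 plans before use):

```cpp
#include <lmic.h>

#define TTN_AS923_GROUP 2   // 1 = AS920-923 ("AS1"), 2 = AS923-925 ("AS2")

#if TTN_AS923_GROUP == 1
static const uint32_t kChannels[] = { 923200000, 923400000, 922200000,
    922400000, 922600000, 922800000, 923000000, 922000000 };
#else
static const uint32_t kChannels[] = { 923200000, 923400000, 923600000,
    923800000, 924000000, 924200000, 924400000, 924600000 };
#endif

static void setupChannels() {
    for (uint8_t i = 0; i < sizeof(kChannels) / sizeof(kChannels[0]); i++)
        LMIC_setupChannel(i, kChannels[i], DR_RANGE_MAP(DR_SF12, DR_SF7),
                          BAND_CENTI); // band arg only matters for EU-style bands
}
```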
